00:00:00.001 Started by upstream project "autotest-spdk-v24.09-vs-dpdk-v23.11" build number 136 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3637 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.001 Started by timer 00:00:00.039 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.040 The recommended git tool is: git 00:00:00.041 using credential 00000000-0000-0000-0000-000000000002 00:00:00.044 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.060 Fetching changes from the remote Git repository 00:00:00.065 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.085 Using shallow fetch with depth 1 00:00:00.085 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.085 > git --version # timeout=10 00:00:00.108 > git --version # 'git version 2.39.2' 00:00:00.108 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.143 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.143 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.289 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.299 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.308 Checking out Revision b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf (FETCH_HEAD) 00:00:02.308 > git config core.sparsecheckout # timeout=10 00:00:02.318 > git read-tree -mu HEAD # timeout=10 00:00:02.333 > git checkout -f b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=5 00:00:02.350 Commit message: "jenkins/jjb-config: Ignore OS version mismatch under freebsd" 00:00:02.350 > git rev-list --no-walk b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=10 00:00:02.575 [Pipeline] Start of Pipeline 00:00:02.591 [Pipeline] library 00:00:02.593 Loading library shm_lib@master 00:00:02.594 Library shm_lib@master is cached. Copying from home. 00:00:02.613 [Pipeline] node 00:00:02.644 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:02.647 [Pipeline] { 00:00:02.656 [Pipeline] catchError 00:00:02.657 [Pipeline] { 00:00:02.669 [Pipeline] wrap 00:00:02.678 [Pipeline] { 00:00:02.687 [Pipeline] stage 00:00:02.689 [Pipeline] { (Prologue) 00:00:02.948 [Pipeline] sh 00:00:03.237 + logger -p user.info -t JENKINS-CI 00:00:03.256 [Pipeline] echo 00:00:03.257 Node: WFP20 00:00:03.266 [Pipeline] sh 00:00:03.578 [Pipeline] setCustomBuildProperty 00:00:03.592 [Pipeline] echo 00:00:03.594 Cleanup processes 00:00:03.600 [Pipeline] sh 00:00:03.887 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:03.887 847738 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:03.899 [Pipeline] sh 00:00:04.188 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.188 ++ grep -v 'sudo pgrep' 00:00:04.188 ++ awk '{print $1}' 00:00:04.188 + sudo kill -9 00:00:04.188 + true 00:00:04.201 [Pipeline] cleanWs 00:00:04.210 [WS-CLEANUP] Deleting project workspace... 00:00:04.210 [WS-CLEANUP] Deferred wipeout is used... 
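The prologue above kills any SPDK processes left over from a previous run before the workspace is wiped. A minimal sketch of that cleanup step, assuming the same pgrep/awk pattern shown in the trace (the WORKSPACE variable and the explicit emptiness guard are illustrative additions, not taken from the job's actual script):

#!/usr/bin/env bash
# Illustrative sketch of the stale-process cleanup traced above.
WORKSPACE=${WORKSPACE:-/var/jenkins/workspace/short-fuzz-phy-autotest}

# List anything still running out of the previous checkout, drop the pgrep
# invocation itself, and keep only the PIDs (first column of `pgrep -af`).
pids=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')

# In the trace the PID list was empty, so `kill -9` had nothing to signal and
# the step fell through to `true`; the guard below has the same effect.
[ -n "$pids" ] && sudo kill -9 $pids || true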
00:00:04.217 [WS-CLEANUP] done 00:00:04.221 [Pipeline] setCustomBuildProperty 00:00:04.234 [Pipeline] sh 00:00:04.517 + sudo git config --global --replace-all safe.directory '*' 00:00:04.599 [Pipeline] httpRequest 00:00:05.083 [Pipeline] echo 00:00:05.085 Sorcerer 10.211.164.20 is alive 00:00:05.093 [Pipeline] retry 00:00:05.096 [Pipeline] { 00:00:05.110 [Pipeline] httpRequest 00:00:05.114 HttpMethod: GET 00:00:05.114 URL: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:05.115 Sending request to url: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:05.123 Response Code: HTTP/1.1 200 OK 00:00:05.123 Success: Status code 200 is in the accepted range: 200,404 00:00:05.124 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:05.591 [Pipeline] } 00:00:05.608 [Pipeline] // retry 00:00:05.616 [Pipeline] sh 00:00:05.901 + tar --no-same-owner -xf jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:05.917 [Pipeline] httpRequest 00:00:06.742 [Pipeline] echo 00:00:06.743 Sorcerer 10.211.164.20 is alive 00:00:06.751 [Pipeline] retry 00:00:06.752 [Pipeline] { 00:00:06.763 [Pipeline] httpRequest 00:00:06.768 HttpMethod: GET 00:00:06.768 URL: http://10.211.164.20/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:06.769 Sending request to url: http://10.211.164.20/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:06.783 Response Code: HTTP/1.1 200 OK 00:00:06.783 Success: Status code 200 is in the accepted range: 200,404 00:00:06.783 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:23.255 [Pipeline] } 00:01:23.272 [Pipeline] // retry 00:01:23.279 [Pipeline] sh 00:01:23.566 + tar --no-same-owner -xf spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:26.137 [Pipeline] sh 00:01:26.453 + git -C spdk log --oneline -n5 00:01:26.453 b18e1bd62 version: v24.09.1-pre 00:01:26.453 19524ad45 version: v24.09 00:01:26.453 9756b40a3 dpdk: update submodule to include alarm_cancel fix 00:01:26.453 a808500d2 test/nvmf: disable nvmf_shutdown_tc4 on e810 00:01:26.453 3024272c6 bdev/nvme: take nvme_ctrlr.mutex when setting keys 00:01:26.472 [Pipeline] withCredentials 00:01:26.483 > git --version # timeout=10 00:01:26.496 > git --version # 'git version 2.39.2' 00:01:26.512 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:26.514 [Pipeline] { 00:01:26.522 [Pipeline] retry 00:01:26.524 [Pipeline] { 00:01:26.538 [Pipeline] sh 00:01:26.824 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:01:26.836 [Pipeline] } 00:01:26.858 [Pipeline] // retry 00:01:26.862 [Pipeline] } 00:01:26.878 [Pipeline] // withCredentials 00:01:26.887 [Pipeline] httpRequest 00:01:27.271 [Pipeline] echo 00:01:27.273 Sorcerer 10.211.164.20 is alive 00:01:27.284 [Pipeline] retry 00:01:27.287 [Pipeline] { 00:01:27.302 [Pipeline] httpRequest 00:01:27.307 HttpMethod: GET 00:01:27.307 URL: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:27.308 Sending request to url: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:27.311 Response Code: HTTP/1.1 200 OK 00:01:27.312 Success: Status code 200 is in the accepted range: 200,404 00:01:27.312 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:33.899 
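Each dependency (jbp, spdk, dpdk) is pulled from the internal package cache at 10.211.164.20 as a tarball named after its pinned commit and unpacked with --no-same-owner. A rough sketch of that fetch-and-extract step, assuming plain curl in place of the pipeline's httpRequest/retry steps (the fetch helper and the retry count are illustrative; the URL layout and commit SHAs come from the log above):

#!/usr/bin/env bash
set -euo pipefail

# Package cache host and pinned commits, as they appear in the log above.
SORCERER=http://10.211.164.20
JBP_SHA=b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf
SPDK_SHA=b18e1bd6297ec2f89ab275de3193457af1c946df
DPDK_SHA=d15625009dced269fcec27fc81dd74fd58d54cdb

fetch() {
    # $1 = package prefix, $2 = commit SHA; curl stands in for httpRequest.
    local tarball="${1}_${2}.tar.gz"
    curl -fSL --retry 3 -o "$tarball" "$SORCERER/packages/$tarball"
    # --no-same-owner matches the extraction commands in the log.
    tar --no-same-owner -xf "$tarball"
}

fetch jbp  "$JBP_SHA"
fetch spdk "$SPDK_SHA"
fetch dpdk "$DPDK_SHA"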
[Pipeline] } 00:01:33.920 [Pipeline] // retry 00:01:33.929 [Pipeline] sh 00:01:34.218 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:35.611 [Pipeline] sh 00:01:35.899 + git -C dpdk log --oneline -n5 00:01:35.899 eeb0605f11 version: 23.11.0 00:01:35.899 238778122a doc: update release notes for 23.11 00:01:35.899 46aa6b3cfc doc: fix description of RSS features 00:01:35.899 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:35.899 7e421ae345 devtools: support skipping forbid rule check 00:01:35.909 [Pipeline] } 00:01:35.923 [Pipeline] // stage 00:01:35.933 [Pipeline] stage 00:01:35.936 [Pipeline] { (Prepare) 00:01:35.957 [Pipeline] writeFile 00:01:35.975 [Pipeline] sh 00:01:36.260 + logger -p user.info -t JENKINS-CI 00:01:36.271 [Pipeline] sh 00:01:36.551 + logger -p user.info -t JENKINS-CI 00:01:36.565 [Pipeline] sh 00:01:36.855 + cat autorun-spdk.conf 00:01:36.855 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:36.855 SPDK_TEST_FUZZER_SHORT=1 00:01:36.855 SPDK_TEST_FUZZER=1 00:01:36.855 SPDK_TEST_SETUP=1 00:01:36.855 SPDK_RUN_UBSAN=1 00:01:36.856 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:36.856 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:36.863 RUN_NIGHTLY=1 00:01:36.882 [Pipeline] readFile 00:01:36.908 [Pipeline] withEnv 00:01:36.910 [Pipeline] { 00:01:36.922 [Pipeline] sh 00:01:37.209 + set -ex 00:01:37.209 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:37.209 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:37.209 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:37.209 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:37.209 ++ SPDK_TEST_FUZZER=1 00:01:37.209 ++ SPDK_TEST_SETUP=1 00:01:37.209 ++ SPDK_RUN_UBSAN=1 00:01:37.209 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:37.209 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:37.209 ++ RUN_NIGHTLY=1 00:01:37.209 + case $SPDK_TEST_NVMF_NICS in 00:01:37.209 + DRIVERS= 00:01:37.209 + [[ -n '' ]] 00:01:37.209 + exit 0 00:01:37.219 [Pipeline] } 00:01:37.233 [Pipeline] // withEnv 00:01:37.238 [Pipeline] } 00:01:37.250 [Pipeline] // stage 00:01:37.261 [Pipeline] catchError 00:01:37.263 [Pipeline] { 00:01:37.277 [Pipeline] timeout 00:01:37.277 Timeout set to expire in 30 min 00:01:37.279 [Pipeline] { 00:01:37.292 [Pipeline] stage 00:01:37.294 [Pipeline] { (Tests) 00:01:37.308 [Pipeline] sh 00:01:37.595 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.595 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.595 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.595 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:37.595 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:37.595 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:37.595 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:37.595 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:37.595 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:37.595 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:37.595 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:37.595 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.595 + source /etc/os-release 00:01:37.595 ++ NAME='Fedora Linux' 00:01:37.595 ++ VERSION='39 (Cloud Edition)' 00:01:37.595 ++ ID=fedora 00:01:37.595 ++ VERSION_ID=39 00:01:37.595 ++ VERSION_CODENAME= 00:01:37.595 ++ PLATFORM_ID=platform:f39 00:01:37.595 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:01:37.595 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:37.595 ++ LOGO=fedora-logo-icon 00:01:37.595 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:01:37.595 ++ HOME_URL=https://fedoraproject.org/ 00:01:37.595 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:01:37.595 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:37.595 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:37.596 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:37.596 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:01:37.596 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:37.596 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:01:37.596 ++ SUPPORT_END=2024-11-12 00:01:37.596 ++ VARIANT='Cloud Edition' 00:01:37.596 ++ VARIANT_ID=cloud 00:01:37.596 + uname -a 00:01:37.596 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:01:37.596 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:40.893 Hugepages 00:01:40.893 node hugesize free / total 00:01:40.893 node0 1048576kB 0 / 0 00:01:40.893 node0 2048kB 0 / 0 00:01:40.893 node1 1048576kB 0 / 0 00:01:40.893 node1 2048kB 0 / 0 00:01:40.893 00:01:40.893 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:40.893 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:40.893 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:40.893 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:40.893 + rm -f /tmp/spdk-ld-path 00:01:40.893 + source autorun-spdk.conf 00:01:40.893 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:40.893 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:40.893 ++ SPDK_TEST_FUZZER=1 00:01:40.893 ++ SPDK_TEST_SETUP=1 00:01:40.893 ++ SPDK_RUN_UBSAN=1 00:01:40.893 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:40.893 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:40.893 ++ RUN_NIGHTLY=1 00:01:40.893 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:40.893 + [[ -n '' ]] 00:01:40.893 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:40.893 + for M in 
/var/spdk/build-*-manifest.txt 00:01:40.893 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:01:40.893 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:40.893 + for M in /var/spdk/build-*-manifest.txt 00:01:40.893 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:40.893 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:40.893 + for M in /var/spdk/build-*-manifest.txt 00:01:40.893 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:40.893 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:40.893 ++ uname 00:01:40.893 + [[ Linux == \L\i\n\u\x ]] 00:01:40.893 + sudo dmesg -T 00:01:40.893 + sudo dmesg --clear 00:01:40.893 + dmesg_pid=848671 00:01:40.893 + [[ Fedora Linux == FreeBSD ]] 00:01:40.893 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:40.893 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:40.893 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:40.893 + [[ -x /usr/src/fio-static/fio ]] 00:01:40.893 + export FIO_BIN=/usr/src/fio-static/fio 00:01:40.893 + FIO_BIN=/usr/src/fio-static/fio 00:01:40.893 + sudo dmesg -Tw 00:01:40.893 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:40.893 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:40.893 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:40.893 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:40.893 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:40.893 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:40.893 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:40.893 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:40.894 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:40.894 Test configuration: 00:01:40.894 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:40.894 SPDK_TEST_FUZZER_SHORT=1 00:01:40.894 SPDK_TEST_FUZZER=1 00:01:40.894 SPDK_TEST_SETUP=1 00:01:40.894 SPDK_RUN_UBSAN=1 00:01:40.894 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:40.894 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:40.894 RUN_NIGHTLY=1 08:13:53 -- common/autotest_common.sh@1680 -- $ [[ n == y ]] 00:01:40.894 08:13:53 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:40.894 08:13:53 -- scripts/common.sh@15 -- $ shopt -s extglob 00:01:40.894 08:13:53 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:40.894 08:13:53 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:40.894 08:13:53 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:40.894 08:13:53 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:40.894 08:13:53 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:40.894 08:13:53 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:40.894 08:13:53 -- paths/export.sh@5 -- $ export PATH 00:01:40.894 08:13:53 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:40.894 08:13:53 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:40.894 08:13:53 -- common/autobuild_common.sh@479 -- $ date +%s 00:01:40.894 08:13:53 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1731827633.XXXXXX 00:01:40.894 08:13:53 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1731827633.2Dz6q7 00:01:40.894 08:13:53 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:01:40.894 08:13:53 -- common/autobuild_common.sh@485 -- $ '[' -n v23.11 ']' 00:01:40.894 08:13:53 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:40.894 08:13:53 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:40.894 08:13:53 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:40.894 08:13:53 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:40.894 08:13:53 -- common/autobuild_common.sh@495 -- $ get_config_params 00:01:40.894 08:13:54 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:01:40.894 08:13:54 -- common/autotest_common.sh@10 -- $ set +x 00:01:40.894 08:13:54 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:40.894 08:13:54 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:01:40.894 08:13:54 -- pm/common@17 -- $ local monitor 00:01:40.894 08:13:54 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:40.894 08:13:54 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:40.894 08:13:54 -- pm/common@19 -- $ for monitor in 
"${MONITOR_RESOURCES[@]}" 00:01:40.894 08:13:54 -- pm/common@21 -- $ date +%s 00:01:41.154 08:13:54 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:41.154 08:13:54 -- pm/common@21 -- $ date +%s 00:01:41.154 08:13:54 -- pm/common@21 -- $ date +%s 00:01:41.154 08:13:54 -- pm/common@25 -- $ sleep 1 00:01:41.154 08:13:54 -- pm/common@21 -- $ date +%s 00:01:41.154 08:13:54 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731827634 00:01:41.154 08:13:54 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731827634 00:01:41.154 08:13:54 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731827634 00:01:41.154 08:13:54 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731827634 00:01:41.154 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731827634_collect-cpu-temp.pm.log 00:01:41.154 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731827634_collect-cpu-load.pm.log 00:01:41.154 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731827634_collect-vmstat.pm.log 00:01:41.154 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731827634_collect-bmc-pm.bmc.pm.log 00:01:42.095 08:13:55 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:01:42.095 08:13:55 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:42.095 08:13:55 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:42.095 08:13:55 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:42.095 08:13:55 -- spdk/autobuild.sh@16 -- $ date -u 00:01:42.095 Sun Nov 17 07:13:55 AM UTC 2024 00:01:42.095 08:13:55 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:42.095 v24.09-1-gb18e1bd62 00:01:42.095 08:13:55 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:42.095 08:13:55 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:42.095 08:13:55 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:42.095 08:13:55 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:01:42.095 08:13:55 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:42.095 08:13:55 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.095 ************************************ 00:01:42.095 START TEST ubsan 00:01:42.095 ************************************ 00:01:42.095 08:13:55 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:01:42.095 using ubsan 00:01:42.095 00:01:42.095 real 0m0.001s 00:01:42.095 user 0m0.000s 00:01:42.095 sys 0m0.000s 00:01:42.095 08:13:55 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:01:42.095 08:13:55 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:42.095 ************************************ 00:01:42.095 END TEST ubsan 00:01:42.095 ************************************ 00:01:42.095 08:13:55 -- 
spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:01:42.095 08:13:55 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:42.095 08:13:55 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:42.095 08:13:55 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:01:42.095 08:13:55 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:42.095 08:13:55 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.095 ************************************ 00:01:42.095 START TEST build_native_dpdk 00:01:42.095 ************************************ 00:01:42.095 08:13:55 build_native_dpdk -- common/autotest_common.sh@1125 -- $ _build_native_dpdk 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:42.095 eeb0605f11 version: 23.11.0 00:01:42.095 238778122a doc: update release notes for 23.11 00:01:42.095 46aa6b3cfc doc: fix description of RSS features 00:01:42.095 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:42.095 7e421ae345 devtools: support skipping forbid rule check 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:42.095 
08:13:55 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:42.095 patching file config/rte_config.h 00:01:42.095 Hunk #1 succeeded at 60 (offset 1 line). 00:01:42.095 08:13:55 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 24.07.0 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 24.07.0 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:42.095 08:13:55 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:42.355 08:13:55 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:42.356 patching file lib/pcapng/rte_pcapng.c 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 23.11.0 24.07.0 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 23.11.0 '>=' 24.07.0 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:42.356 08:13:55 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:42.356 08:13:55 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:47.637 The Meson build system 00:01:47.637 Version: 1.5.0 00:01:47.637 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:47.637 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:47.637 Build type: native build 00:01:47.637 Program cat found: YES (/usr/bin/cat) 00:01:47.637 Project name: DPDK 00:01:47.637 Project version: 23.11.0 00:01:47.637 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:47.637 C linker for the host machine: gcc ld.bfd 2.40-14 00:01:47.637 Host machine cpu family: x86_64 00:01:47.637 Host machine cpu: x86_64 00:01:47.637 Message: ## Building in Developer Mode ## 00:01:47.637 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:47.638 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:47.638 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:47.638 Program python3 found: YES (/usr/bin/python3) 00:01:47.638 Program cat found: YES (/usr/bin/cat) 00:01:47.638 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:01:47.638 Compiler for C supports arguments -march=native: YES 00:01:47.638 Checking for size of "void *" : 8 00:01:47.638 Checking for size of "void *" : 8 (cached) 00:01:47.638 Library m found: YES 00:01:47.638 Library numa found: YES 00:01:47.638 Has header "numaif.h" : YES 00:01:47.638 Library fdt found: NO 00:01:47.638 Library execinfo found: NO 00:01:47.638 Has header "execinfo.h" : YES 00:01:47.638 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:47.638 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:47.638 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:47.638 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:47.638 Run-time dependency openssl found: YES 3.1.1 00:01:47.638 Run-time dependency libpcap found: YES 1.10.4 00:01:47.638 Has header "pcap.h" with dependency libpcap: YES 00:01:47.638 Compiler for C supports arguments -Wcast-qual: YES 00:01:47.638 Compiler for C supports arguments -Wdeprecated: YES 00:01:47.638 Compiler for C supports arguments -Wformat: YES 00:01:47.638 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:47.638 Compiler for C supports arguments -Wformat-security: NO 00:01:47.638 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:47.638 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:47.638 Compiler for C supports arguments -Wnested-externs: YES 00:01:47.638 Compiler for C supports arguments -Wold-style-definition: YES 00:01:47.638 Compiler for C supports arguments -Wpointer-arith: YES 00:01:47.638 Compiler for C supports arguments -Wsign-compare: YES 00:01:47.638 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:47.638 Compiler for C supports arguments -Wundef: YES 00:01:47.638 Compiler for C supports arguments -Wwrite-strings: YES 00:01:47.638 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:47.638 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:47.638 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:47.638 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:47.638 Program objdump found: YES (/usr/bin/objdump) 00:01:47.638 Compiler for C supports arguments -mavx512f: YES 00:01:47.638 Checking if "AVX512 checking" compiles: YES 00:01:47.638 Fetching value of define "__SSE4_2__" : 1 00:01:47.638 Fetching value of define "__AES__" : 1 00:01:47.638 Fetching value of define "__AVX__" : 1 00:01:47.638 Fetching value of define "__AVX2__" : 1 00:01:47.638 Fetching value of define "__AVX512BW__" : 1 00:01:47.638 Fetching value of define "__AVX512CD__" : 1 00:01:47.638 Fetching value of define "__AVX512DQ__" : 1 00:01:47.638 Fetching value of define "__AVX512F__" : 1 00:01:47.638 Fetching value of define "__AVX512VL__" : 1 00:01:47.638 Fetching value of define "__PCLMUL__" : 1 00:01:47.638 Fetching value of define "__RDRND__" : 1 00:01:47.638 Fetching value of define "__RDSEED__" : 1 00:01:47.638 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:47.638 Fetching value of define "__znver1__" : (undefined) 00:01:47.638 Fetching value of define "__znver2__" : (undefined) 00:01:47.638 Fetching value of define "__znver3__" : (undefined) 00:01:47.638 Fetching value of define "__znver4__" : (undefined) 00:01:47.638 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:47.638 Message: lib/log: Defining dependency "log" 00:01:47.638 Message: lib/kvargs: Defining dependency "kvargs" 00:01:47.638 Message: lib/telemetry: Defining dependency 
"telemetry" 00:01:47.638 Checking for function "getentropy" : NO 00:01:47.638 Message: lib/eal: Defining dependency "eal" 00:01:47.638 Message: lib/ring: Defining dependency "ring" 00:01:47.638 Message: lib/rcu: Defining dependency "rcu" 00:01:47.638 Message: lib/mempool: Defining dependency "mempool" 00:01:47.638 Message: lib/mbuf: Defining dependency "mbuf" 00:01:47.638 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:47.638 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:47.638 Compiler for C supports arguments -mpclmul: YES 00:01:47.638 Compiler for C supports arguments -maes: YES 00:01:47.638 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:47.638 Compiler for C supports arguments -mavx512bw: YES 00:01:47.638 Compiler for C supports arguments -mavx512dq: YES 00:01:47.638 Compiler for C supports arguments -mavx512vl: YES 00:01:47.638 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:47.638 Compiler for C supports arguments -mavx2: YES 00:01:47.638 Compiler for C supports arguments -mavx: YES 00:01:47.638 Message: lib/net: Defining dependency "net" 00:01:47.638 Message: lib/meter: Defining dependency "meter" 00:01:47.638 Message: lib/ethdev: Defining dependency "ethdev" 00:01:47.638 Message: lib/pci: Defining dependency "pci" 00:01:47.638 Message: lib/cmdline: Defining dependency "cmdline" 00:01:47.638 Message: lib/metrics: Defining dependency "metrics" 00:01:47.638 Message: lib/hash: Defining dependency "hash" 00:01:47.638 Message: lib/timer: Defining dependency "timer" 00:01:47.638 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.638 Message: lib/acl: Defining dependency "acl" 00:01:47.638 Message: lib/bbdev: Defining dependency "bbdev" 00:01:47.638 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:47.638 Run-time dependency libelf found: YES 0.191 00:01:47.638 Message: lib/bpf: Defining dependency "bpf" 00:01:47.638 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:47.638 Message: lib/compressdev: Defining dependency "compressdev" 00:01:47.638 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:47.638 Message: lib/distributor: Defining dependency "distributor" 00:01:47.638 Message: lib/dmadev: Defining dependency "dmadev" 00:01:47.638 Message: lib/efd: Defining dependency "efd" 00:01:47.638 Message: lib/eventdev: Defining dependency "eventdev" 00:01:47.638 Message: lib/dispatcher: Defining dependency "dispatcher" 00:01:47.638 Message: lib/gpudev: Defining dependency "gpudev" 00:01:47.638 Message: lib/gro: Defining dependency "gro" 00:01:47.638 Message: lib/gso: Defining dependency "gso" 00:01:47.638 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:47.638 Message: lib/jobstats: Defining dependency "jobstats" 00:01:47.638 Message: lib/latencystats: Defining dependency "latencystats" 00:01:47.638 Message: lib/lpm: Defining dependency "lpm" 00:01:47.638 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:01:47.638 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:47.638 Message: lib/member: Defining dependency "member" 00:01:47.638 Message: lib/pcapng: Defining dependency "pcapng" 00:01:47.638 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:47.638 Message: lib/power: Defining dependency "power" 00:01:47.638 Message: lib/rawdev: Defining dependency "rawdev" 00:01:47.638 Message: lib/regexdev: Defining dependency "regexdev" 00:01:47.638 Message: lib/mldev: Defining dependency "mldev" 00:01:47.638 Message: lib/rib: Defining dependency "rib" 00:01:47.638 Message: lib/reorder: Defining dependency "reorder" 00:01:47.638 Message: lib/sched: Defining dependency "sched" 00:01:47.638 Message: lib/security: Defining dependency "security" 00:01:47.638 Message: lib/stack: Defining dependency "stack" 00:01:47.638 Has header "linux/userfaultfd.h" : YES 00:01:47.638 Has header "linux/vduse.h" : YES 00:01:47.638 Message: lib/vhost: Defining dependency "vhost" 00:01:47.638 Message: lib/ipsec: Defining dependency "ipsec" 00:01:47.638 Message: lib/pdcp: Defining dependency "pdcp" 00:01:47.638 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:47.638 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:47.638 Message: lib/fib: Defining dependency "fib" 00:01:47.638 Message: lib/port: Defining dependency "port" 00:01:47.638 Message: lib/pdump: Defining dependency "pdump" 00:01:47.638 Message: lib/table: Defining dependency "table" 00:01:47.638 Message: lib/pipeline: Defining dependency "pipeline" 00:01:47.638 Message: lib/graph: Defining dependency "graph" 00:01:47.638 Message: lib/node: Defining dependency "node" 00:01:47.638 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:48.588 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:48.589 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:48.589 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:48.589 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:48.589 Compiler for C supports arguments -Wno-unused-value: YES 00:01:48.589 Compiler for C supports arguments -Wno-format: YES 00:01:48.589 Compiler for C supports arguments -Wno-format-security: YES 00:01:48.589 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:48.589 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:48.589 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:48.589 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:48.589 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:48.589 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:48.589 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:48.589 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:48.589 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:48.589 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:48.589 Has header "sys/epoll.h" : YES 00:01:48.589 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:48.589 Configuring doxy-api-html.conf using configuration 00:01:48.589 Configuring doxy-api-man.conf using configuration 00:01:48.589 Program mandb found: YES (/usr/bin/mandb) 00:01:48.589 Program sphinx-build found: NO 00:01:48.589 Configuring rte_build_config.h using configuration 00:01:48.589 Message: 00:01:48.589 ================= 00:01:48.589 Applications Enabled 
00:01:48.589 ================= 00:01:48.589 00:01:48.589 apps: 00:01:48.589 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:01:48.589 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:01:48.589 test-pmd, test-regex, test-sad, test-security-perf, 00:01:48.589 00:01:48.589 Message: 00:01:48.589 ================= 00:01:48.589 Libraries Enabled 00:01:48.589 ================= 00:01:48.589 00:01:48.589 libs: 00:01:48.589 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:48.589 net, meter, ethdev, pci, cmdline, metrics, hash, timer, 00:01:48.589 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 00:01:48.589 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 00:01:48.589 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 00:01:48.589 mldev, rib, reorder, sched, security, stack, vhost, ipsec, 00:01:48.589 pdcp, fib, port, pdump, table, pipeline, graph, node, 00:01:48.589 00:01:48.589 00:01:48.589 Message: 00:01:48.589 =============== 00:01:48.589 Drivers Enabled 00:01:48.589 =============== 00:01:48.589 00:01:48.589 common: 00:01:48.589 00:01:48.589 bus: 00:01:48.589 pci, vdev, 00:01:48.589 mempool: 00:01:48.589 ring, 00:01:48.589 dma: 00:01:48.589 00:01:48.589 net: 00:01:48.589 i40e, 00:01:48.589 raw: 00:01:48.589 00:01:48.589 crypto: 00:01:48.589 00:01:48.589 compress: 00:01:48.589 00:01:48.589 regex: 00:01:48.589 00:01:48.589 ml: 00:01:48.589 00:01:48.589 vdpa: 00:01:48.589 00:01:48.589 event: 00:01:48.589 00:01:48.589 baseband: 00:01:48.589 00:01:48.589 gpu: 00:01:48.589 00:01:48.589 00:01:48.589 Message: 00:01:48.589 ================= 00:01:48.589 Content Skipped 00:01:48.589 ================= 00:01:48.589 00:01:48.589 apps: 00:01:48.589 00:01:48.589 libs: 00:01:48.589 00:01:48.589 drivers: 00:01:48.589 common/cpt: not in enabled drivers build config 00:01:48.589 common/dpaax: not in enabled drivers build config 00:01:48.589 common/iavf: not in enabled drivers build config 00:01:48.589 common/idpf: not in enabled drivers build config 00:01:48.589 common/mvep: not in enabled drivers build config 00:01:48.589 common/octeontx: not in enabled drivers build config 00:01:48.589 bus/auxiliary: not in enabled drivers build config 00:01:48.589 bus/cdx: not in enabled drivers build config 00:01:48.589 bus/dpaa: not in enabled drivers build config 00:01:48.589 bus/fslmc: not in enabled drivers build config 00:01:48.589 bus/ifpga: not in enabled drivers build config 00:01:48.589 bus/platform: not in enabled drivers build config 00:01:48.589 bus/vmbus: not in enabled drivers build config 00:01:48.589 common/cnxk: not in enabled drivers build config 00:01:48.589 common/mlx5: not in enabled drivers build config 00:01:48.589 common/nfp: not in enabled drivers build config 00:01:48.589 common/qat: not in enabled drivers build config 00:01:48.589 common/sfc_efx: not in enabled drivers build config 00:01:48.589 mempool/bucket: not in enabled drivers build config 00:01:48.589 mempool/cnxk: not in enabled drivers build config 00:01:48.589 mempool/dpaa: not in enabled drivers build config 00:01:48.589 mempool/dpaa2: not in enabled drivers build config 00:01:48.589 mempool/octeontx: not in enabled drivers build config 00:01:48.589 mempool/stack: not in enabled drivers build config 00:01:48.589 dma/cnxk: not in enabled drivers build config 00:01:48.589 dma/dpaa: not in enabled drivers build config 00:01:48.589 dma/dpaa2: not in enabled 
drivers build config 00:01:48.589 dma/hisilicon: not in enabled drivers build config 00:01:48.589 dma/idxd: not in enabled drivers build config 00:01:48.589 dma/ioat: not in enabled drivers build config 00:01:48.589 dma/skeleton: not in enabled drivers build config 00:01:48.589 net/af_packet: not in enabled drivers build config 00:01:48.589 net/af_xdp: not in enabled drivers build config 00:01:48.589 net/ark: not in enabled drivers build config 00:01:48.589 net/atlantic: not in enabled drivers build config 00:01:48.589 net/avp: not in enabled drivers build config 00:01:48.589 net/axgbe: not in enabled drivers build config 00:01:48.589 net/bnx2x: not in enabled drivers build config 00:01:48.589 net/bnxt: not in enabled drivers build config 00:01:48.589 net/bonding: not in enabled drivers build config 00:01:48.589 net/cnxk: not in enabled drivers build config 00:01:48.589 net/cpfl: not in enabled drivers build config 00:01:48.589 net/cxgbe: not in enabled drivers build config 00:01:48.589 net/dpaa: not in enabled drivers build config 00:01:48.589 net/dpaa2: not in enabled drivers build config 00:01:48.589 net/e1000: not in enabled drivers build config 00:01:48.589 net/ena: not in enabled drivers build config 00:01:48.589 net/enetc: not in enabled drivers build config 00:01:48.589 net/enetfec: not in enabled drivers build config 00:01:48.589 net/enic: not in enabled drivers build config 00:01:48.589 net/failsafe: not in enabled drivers build config 00:01:48.589 net/fm10k: not in enabled drivers build config 00:01:48.589 net/gve: not in enabled drivers build config 00:01:48.589 net/hinic: not in enabled drivers build config 00:01:48.589 net/hns3: not in enabled drivers build config 00:01:48.589 net/iavf: not in enabled drivers build config 00:01:48.589 net/ice: not in enabled drivers build config 00:01:48.589 net/idpf: not in enabled drivers build config 00:01:48.589 net/igc: not in enabled drivers build config 00:01:48.589 net/ionic: not in enabled drivers build config 00:01:48.589 net/ipn3ke: not in enabled drivers build config 00:01:48.589 net/ixgbe: not in enabled drivers build config 00:01:48.589 net/mana: not in enabled drivers build config 00:01:48.589 net/memif: not in enabled drivers build config 00:01:48.589 net/mlx4: not in enabled drivers build config 00:01:48.589 net/mlx5: not in enabled drivers build config 00:01:48.589 net/mvneta: not in enabled drivers build config 00:01:48.589 net/mvpp2: not in enabled drivers build config 00:01:48.589 net/netvsc: not in enabled drivers build config 00:01:48.589 net/nfb: not in enabled drivers build config 00:01:48.589 net/nfp: not in enabled drivers build config 00:01:48.589 net/ngbe: not in enabled drivers build config 00:01:48.589 net/null: not in enabled drivers build config 00:01:48.589 net/octeontx: not in enabled drivers build config 00:01:48.589 net/octeon_ep: not in enabled drivers build config 00:01:48.589 net/pcap: not in enabled drivers build config 00:01:48.589 net/pfe: not in enabled drivers build config 00:01:48.589 net/qede: not in enabled drivers build config 00:01:48.589 net/ring: not in enabled drivers build config 00:01:48.589 net/sfc: not in enabled drivers build config 00:01:48.589 net/softnic: not in enabled drivers build config 00:01:48.589 net/tap: not in enabled drivers build config 00:01:48.589 net/thunderx: not in enabled drivers build config 00:01:48.589 net/txgbe: not in enabled drivers build config 00:01:48.589 net/vdev_netvsc: not in enabled drivers build config 00:01:48.589 net/vhost: not in enabled drivers 
build config 00:01:48.589 net/virtio: not in enabled drivers build config 00:01:48.589 net/vmxnet3: not in enabled drivers build config 00:01:48.589 raw/cnxk_bphy: not in enabled drivers build config 00:01:48.589 raw/cnxk_gpio: not in enabled drivers build config 00:01:48.589 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:48.589 raw/ifpga: not in enabled drivers build config 00:01:48.589 raw/ntb: not in enabled drivers build config 00:01:48.589 raw/skeleton: not in enabled drivers build config 00:01:48.589 crypto/armv8: not in enabled drivers build config 00:01:48.589 crypto/bcmfs: not in enabled drivers build config 00:01:48.589 crypto/caam_jr: not in enabled drivers build config 00:01:48.589 crypto/ccp: not in enabled drivers build config 00:01:48.589 crypto/cnxk: not in enabled drivers build config 00:01:48.589 crypto/dpaa_sec: not in enabled drivers build config 00:01:48.589 crypto/dpaa2_sec: not in enabled drivers build config 00:01:48.589 crypto/ipsec_mb: not in enabled drivers build config 00:01:48.589 crypto/mlx5: not in enabled drivers build config 00:01:48.589 crypto/mvsam: not in enabled drivers build config 00:01:48.589 crypto/nitrox: not in enabled drivers build config 00:01:48.589 crypto/null: not in enabled drivers build config 00:01:48.590 crypto/octeontx: not in enabled drivers build config 00:01:48.590 crypto/openssl: not in enabled drivers build config 00:01:48.590 crypto/scheduler: not in enabled drivers build config 00:01:48.590 crypto/uadk: not in enabled drivers build config 00:01:48.590 crypto/virtio: not in enabled drivers build config 00:01:48.590 compress/isal: not in enabled drivers build config 00:01:48.590 compress/mlx5: not in enabled drivers build config 00:01:48.590 compress/octeontx: not in enabled drivers build config 00:01:48.590 compress/zlib: not in enabled drivers build config 00:01:48.590 regex/mlx5: not in enabled drivers build config 00:01:48.590 regex/cn9k: not in enabled drivers build config 00:01:48.590 ml/cnxk: not in enabled drivers build config 00:01:48.590 vdpa/ifc: not in enabled drivers build config 00:01:48.590 vdpa/mlx5: not in enabled drivers build config 00:01:48.590 vdpa/nfp: not in enabled drivers build config 00:01:48.590 vdpa/sfc: not in enabled drivers build config 00:01:48.590 event/cnxk: not in enabled drivers build config 00:01:48.590 event/dlb2: not in enabled drivers build config 00:01:48.590 event/dpaa: not in enabled drivers build config 00:01:48.590 event/dpaa2: not in enabled drivers build config 00:01:48.590 event/dsw: not in enabled drivers build config 00:01:48.590 event/opdl: not in enabled drivers build config 00:01:48.590 event/skeleton: not in enabled drivers build config 00:01:48.590 event/sw: not in enabled drivers build config 00:01:48.590 event/octeontx: not in enabled drivers build config 00:01:48.590 baseband/acc: not in enabled drivers build config 00:01:48.590 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:48.590 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:48.590 baseband/la12xx: not in enabled drivers build config 00:01:48.590 baseband/null: not in enabled drivers build config 00:01:48.590 baseband/turbo_sw: not in enabled drivers build config 00:01:48.590 gpu/cuda: not in enabled drivers build config 00:01:48.590 00:01:48.590 00:01:48.590 Build targets in project: 217 00:01:48.590 00:01:48.590 DPDK 23.11.0 00:01:48.590 00:01:48.590 User defined options 00:01:48.590 libdir : lib 00:01:48.590 prefix : 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.590 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:48.590 c_link_args : 00:01:48.590 enable_docs : false 00:01:48.590 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:48.590 enable_kmods : false 00:01:48.590 machine : native 00:01:48.590 tests : false 00:01:48.590 00:01:48.590 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:48.590 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:01:48.590 08:14:01 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:48.590 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:48.590 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:48.855 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:48.855 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:48.855 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:48.855 [5/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:48.855 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:48.855 [7/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:48.855 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:48.855 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:48.855 [10/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:48.855 [11/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:48.855 [12/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:48.855 [13/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:48.855 [14/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:48.855 [15/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:48.855 [16/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:48.855 [17/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:48.855 [18/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:48.855 [19/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:48.855 [20/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:48.855 [21/707] Linking static target lib/librte_kvargs.a 00:01:49.114 [22/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:49.114 [23/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:49.114 [24/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:49.114 [25/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:49.114 [26/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:49.114 [27/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:49.114 [28/707] Linking static target lib/librte_pci.a 00:01:49.114 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:49.114 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:49.114 [31/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 
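For context, the long "not in enabled drivers build config" listing above simply reflects the enable_drivers value shown under "User defined options": only bus, bus/pci, bus/vdev, mempool/ring, net/i40e and net/i40e/base are built for this run. The configure-and-build step can be reconstructed from that printed summary as sketched below. This is an assumption pieced together from the options meson echoed back, not the literal CI command line (the real invocation lives in common/autobuild_common.sh and is not shown verbatim in this log); it uses the explicit `meson setup` form, which avoids the deprecation warning meson prints above. The build directory, prefix and -j112 are taken from the traced ninja call.

# Hedged reconstruction of the DPDK configure/build step, based only on the
# "User defined options" summary and the ninja command traced above.
meson setup build-tmp \
    --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
    --libdir=lib \
    -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
    -Denable_docs=false \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base \
    -Denable_kmods=false \
    -Dmachine=native \
    -Dtests=false
ninja -C build-tmp -j112    # build step, as traced at 08:14:01 above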
00:01:49.114 [32/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:49.114 [33/707] Linking static target lib/librte_log.a 00:01:49.114 [34/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:49.114 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:49.114 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:49.374 [37/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:49.374 [38/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:49.374 [39/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:49.374 [40/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.374 [41/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:49.374 [42/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:49.374 [43/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:49.374 [44/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:49.374 [45/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:49.374 [46/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.374 [47/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:49.374 [48/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:49.374 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:49.374 [50/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:49.374 [51/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:49.374 [52/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:49.374 [53/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:49.374 [54/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:49.374 [55/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:49.374 [56/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:49.374 [57/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:49.374 [58/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:49.374 [59/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:49.374 [60/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:49.374 [61/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:49.374 [62/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:49.374 [63/707] Linking static target lib/librte_meter.a 00:01:49.639 [64/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:49.639 [65/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:49.639 [66/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:49.639 [67/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:49.639 [68/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:49.639 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:49.639 [70/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:49.639 [71/707] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:49.639 [72/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:49.639 [73/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:49.639 [74/707] Linking static target lib/librte_ring.a 00:01:49.639 [75/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:49.639 [76/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:49.639 [77/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:49.639 [78/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:49.639 [79/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:49.639 [80/707] Linking static target lib/librte_cmdline.a 00:01:49.639 [81/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:49.639 [82/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:49.639 [83/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:49.639 [84/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:49.639 [85/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:49.639 [86/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:49.639 [87/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:49.639 [88/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:49.639 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:49.639 [90/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:49.639 [91/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:49.639 [92/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:49.639 [93/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:49.639 [94/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:49.640 [95/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:49.640 [96/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:49.640 [97/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:49.640 [98/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:49.640 [99/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:49.640 [100/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:49.640 [101/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:49.640 [102/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:49.640 [103/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:49.640 [104/707] Linking static target lib/librte_metrics.a 00:01:49.640 [105/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:49.640 [106/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:49.640 [107/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:49.640 [108/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:49.640 [109/707] Linking static target lib/librte_net.a 00:01:49.640 [110/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:49.640 [111/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:49.640 [112/707] Linking static target lib/librte_bitratestats.a 00:01:49.640 [113/707] Compiling C object 
lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:49.640 [114/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:49.640 [115/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:49.900 [116/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:49.900 [117/707] Linking static target lib/librte_cfgfile.a 00:01:49.900 [118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:49.900 [119/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:49.900 [120/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:49.900 [121/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:49.900 [122/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:49.900 [123/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:49.900 [124/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:49.900 [125/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:49.900 [126/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:49.900 [127/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:49.900 [128/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:49.900 [129/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.900 [130/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.900 [131/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:49.900 [132/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:49.900 [133/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:49.900 [134/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:49.900 [135/707] Linking target lib/librte_log.so.24.0 00:01:49.900 [136/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.900 [137/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:49.900 [138/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:49.900 [139/707] Linking static target lib/librte_timer.a 00:01:50.159 [140/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:50.159 [141/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:50.159 [142/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:50.159 [143/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.159 [144/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:50.159 [145/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:50.159 [146/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:50.159 [147/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.159 [148/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:50.159 [149/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:50.159 [150/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:50.159 [151/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:50.159 [152/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:50.159 
[153/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:50.159 [154/707] Linking static target lib/librte_mempool.a 00:01:50.159 [155/707] Linking static target lib/librte_bbdev.a 00:01:50.159 [156/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:50.159 [157/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:50.159 [158/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:50.159 [159/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:50.159 [160/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:50.159 [161/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:50.159 [162/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:50.159 [163/707] Linking static target lib/librte_jobstats.a 00:01:50.159 [164/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:50.159 [165/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:50.159 [166/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:50.159 [167/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:50.159 [168/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:50.159 [169/707] Linking target lib/librte_kvargs.so.24.0 00:01:50.419 [170/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:50.419 [171/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:50.419 [172/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.419 [173/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:50.419 [174/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.419 [175/707] Linking static target lib/librte_compressdev.a 00:01:50.419 [176/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:50.419 [177/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:50.419 [178/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:50.419 [179/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:50.420 [180/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:50.420 [181/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:50.420 [182/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:50.420 [183/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:50.420 [184/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:50.420 [185/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:50.420 [186/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:50.420 [187/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:50.420 [188/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:50.420 [189/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:50.420 [190/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:50.420 [191/707] Linking static target lib/librte_dispatcher.a 00:01:50.420 [192/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:50.420 [193/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:50.420 [194/707] 
Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:50.420 [195/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:50.420 [196/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:50.420 [197/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:50.420 [198/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:50.420 [199/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:50.420 [200/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:50.420 [201/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:50.420 [202/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:50.420 [203/707] Linking static target lib/librte_latencystats.a 00:01:50.420 [204/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:50.420 [205/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:50.420 [206/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:50.420 [207/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:50.420 [208/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.420 [209/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:50.420 [210/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:50.681 [211/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:50.681 [212/707] Linking static target lib/librte_stack.a 00:01:50.681 [213/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:50.681 [214/707] Linking static target lib/librte_eal.a 00:01:50.681 [215/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:50.681 [216/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:50.681 [217/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:50.681 [218/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:50.681 [219/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:50.681 [220/707] Linking static target lib/librte_gpudev.a 00:01:50.682 [221/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:50.682 [222/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:50.682 [223/707] Linking static target lib/librte_gro.a 00:01:50.682 [224/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:50.682 [225/707] Linking static target lib/librte_telemetry.a 00:01:50.682 [226/707] Linking static target lib/librte_dmadev.a 00:01:50.682 [227/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:50.682 [228/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:50.682 [229/707] Linking static target lib/librte_rcu.a 00:01:50.682 [230/707] Linking static target lib/librte_regexdev.a 00:01:50.682 [231/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:50.682 [232/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:50.682 [233/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:50.682 [234/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:50.682 [235/707] Linking static target lib/librte_rawdev.a 00:01:50.682 [236/707] Linking static target 
lib/librte_gso.a 00:01:50.682 [237/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:50.682 [238/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:50.682 [239/707] Linking static target lib/librte_distributor.a 00:01:50.682 [240/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:50.682 [241/707] Linking static target lib/librte_mldev.a 00:01:50.682 [242/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:50.682 [243/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:50.682 [244/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:50.682 [245/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:50.682 [246/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:50.682 [247/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.682 [248/707] Linking static target lib/librte_power.a 00:01:50.682 [249/707] Linking static target lib/librte_mbuf.a 00:01:50.682 [250/707] Linking static target lib/librte_ip_frag.a 00:01:50.946 [251/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:50.946 [252/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:50.946 [253/707] Linking static target lib/librte_pcapng.a 00:01:50.946 [254/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:50.946 [255/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:50.946 [256/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:50.946 [257/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.946 [258/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:50.946 [259/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.946 [260/707] Linking static target lib/librte_reorder.a 00:01:50.946 [261/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:50.946 [262/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:50.946 [263/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.946 [264/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:50.946 [265/707] Linking static target lib/librte_bpf.a 00:01:50.946 [266/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:50.946 [267/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:50.946 [268/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:50.946 [269/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:50.946 [270/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:50.946 [271/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:50.946 [272/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.946 [273/707] Linking static target lib/librte_security.a 00:01:50.946 [274/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.946 [275/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:50.946 [276/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:50.946 [277/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:51.207 [278/707] Compiling 
C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:51.207 [279/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:51.207 [280/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [281/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [282/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:51.207 [283/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [284/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:51.207 [285/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:51.207 [286/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:51.207 [287/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [288/707] Linking static target lib/librte_lpm.a 00:01:51.207 [289/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [290/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:51.207 [291/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:51.207 [292/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:51.207 [293/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [294/707] Linking static target lib/librte_rib.a 00:01:51.207 [295/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:51.207 [296/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:51.207 [297/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [298/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:51.207 [299/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:51.207 [300/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:51.207 [301/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.207 [302/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.471 [303/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:51.471 [304/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:51.471 [305/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:51.471 [306/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.471 [307/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:51.471 [308/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:51.471 [309/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:51.471 [310/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:51.471 [311/707] Linking target lib/librte_telemetry.so.24.0 00:01:51.471 [312/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.471 [313/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:51.471 [314/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:51.471 [315/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.471 [316/707] 
Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.471 [317/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:51.471 [318/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:51.471 [319/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:51.471 [320/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:51.471 [321/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:51.471 [322/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:51.471 [323/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:51.471 [324/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:51.471 [325/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:51.471 [326/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:51.471 [327/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:51.471 [328/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:51.734 [329/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:51.734 [330/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:51.734 [331/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:51.734 [332/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:51.734 [333/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:51.734 [334/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:51.734 [335/707] Linking static target lib/librte_efd.a 00:01:51.734 [336/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:51.734 [337/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.734 [338/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:51.734 [339/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:51.734 [340/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.734 [341/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:51.734 [342/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:51.734 [343/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:51.734 [344/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:51.734 [345/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:51.734 [346/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:51.734 [347/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.734 [348/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:51.734 [349/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:51.734 [350/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.734 [351/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:51.734 [352/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:51.734 [353/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:51.734 [354/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:51.996 [355/707] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:51.996 [356/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:51.996 [357/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:51.996 [358/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:51.996 [359/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:51.996 [360/707] Linking static target lib/librte_fib.a 00:01:51.996 [361/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:51.996 [362/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.996 [363/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.996 [364/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:51.996 [365/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:51.996 [366/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:51.996 [367/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:51.996 [368/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:51.996 [369/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:51.996 [370/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:51.996 [371/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:51.996 [372/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.996 [373/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:51.996 [374/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.996 [375/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:51.996 [376/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:51.996 [377/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:51.996 [378/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:51.996 [379/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:52.255 [380/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:52.255 [381/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:52.255 [382/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:52.255 [383/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:52.255 [384/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:52.255 [385/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:52.255 [386/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:52.255 [387/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:52.255 [388/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:52.255 [389/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:52.255 [390/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:52.255 [391/707] Linking static target lib/librte_pdump.a 00:01:52.255 [392/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:52.255 [393/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:52.255 [394/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:52.255 [395/707] Compiling C object 
lib/librte_node.a.p/node_udp4_input.c.o 00:01:52.255 [396/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:52.255 [397/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:52.255 [398/707] Linking static target lib/librte_graph.a 00:01:52.255 [399/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:52.255 [400/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:52.255 [401/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:52.255 [402/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:01:52.255 [403/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:52.513 [404/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:52.513 [405/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:52.513 [406/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:52.513 [407/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:52.513 [408/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:52.513 [409/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:52.513 [410/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:52.513 [411/707] Linking static target lib/librte_table.a 00:01:52.513 [412/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:52.513 [413/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:52.513 [414/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:52.513 [415/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:52.513 [416/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:52.513 [417/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:52.513 [418/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:52.513 [419/707] Linking static target drivers/librte_bus_vdev.a 00:01:52.513 [420/707] Linking static target lib/librte_sched.a 00:01:52.513 [421/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:52.513 [422/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:52.513 [423/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.513 [424/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:52.513 [425/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:52.513 [426/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:52.513 [427/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:52.513 [428/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:52.513 [429/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:52.784 [430/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:52.784 [431/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:52.784 [432/707] Linking static target lib/librte_cryptodev.a 00:01:52.784 [433/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.784 [434/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:52.784 [435/707] Compiling C object 
lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:52.784 [436/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:52.784 [437/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:52.784 [438/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:52.784 [439/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:52.784 [440/707] Linking static target drivers/librte_bus_pci.a 00:01:52.784 [441/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:52.784 [442/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:52.784 [443/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:52.784 [444/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:52.784 [445/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:52.784 [446/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:52.784 [447/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:52.784 [448/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:52.784 [449/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:53.053 [450/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:53.053 [451/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:53.053 [452/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:53.053 [453/707] Linking static target lib/librte_ipsec.a 00:01:53.053 [454/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:53.053 [455/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:53.053 [456/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:53.053 [457/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:53.053 [458/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:53.053 [459/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:53.053 [460/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:53.053 [461/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.053 [462/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:53.053 [463/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:53.053 [464/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:53.053 [465/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:53.053 [466/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:53.053 [467/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:53.053 [468/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:53.053 [469/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.053 [470/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:53.054 [471/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:53.054 [472/707] Compiling C object 
lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:53.054 [473/707] Linking static target lib/librte_member.a 00:01:53.054 [474/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:53.054 [475/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:53.054 [476/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:53.054 [477/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:53.054 [478/707] Linking static target lib/librte_pdcp.a 00:01:53.054 [479/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:53.054 [480/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.054 [481/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:53.054 [482/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:53.054 [483/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:53.054 [484/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:53.054 [485/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:53.054 [486/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:53.054 [487/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:53.054 [488/707] Linking static target lib/librte_node.a 00:01:53.314 [489/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:53.314 [490/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:53.314 [491/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:53.314 [492/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:53.314 [493/707] Linking static target lib/librte_hash.a 00:01:53.314 [494/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:53.314 [495/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:53.314 [496/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:53.314 [497/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:53.314 [498/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:53.314 [499/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:53.314 [500/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:53.314 [501/707] Linking static target drivers/librte_mempool_ring.a 00:01:53.314 [502/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.314 [503/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:53.314 [504/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:53.314 [505/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:53.314 [506/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.314 [507/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:53.314 [508/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:53.314 [509/707] Linking static target lib/acl/libavx2_tmp.a 00:01:53.314 [510/707] Compiling C 
object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:53.314 [511/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.314 [512/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:53.314 [513/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:53.314 [514/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:53.314 [515/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:53.314 [516/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.314 [517/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:53.314 [518/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:53.314 [519/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:53.314 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:53.314 [521/707] Linking static target lib/librte_port.a 00:01:53.574 [522/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:53.574 [523/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:53.574 [524/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:53.574 [525/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:53.574 [526/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.574 [527/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:53.574 [528/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:53.574 [529/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:53.574 [530/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.574 [531/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:53.574 [532/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:53.574 [533/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:53.574 [534/707] Linking static target lib/librte_eventdev.a 00:01:53.574 [535/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.574 [536/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:53.574 [537/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:53.574 [538/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:53.574 [539/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:53.574 [540/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:53.574 [541/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:53.574 [542/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:53.574 [543/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:53.574 [544/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:53.574 [545/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:53.833 [546/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:53.833 [547/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:53.833 [548/707] Compiling C object 
app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:53.833 [549/707] Linking static target lib/librte_acl.a 00:01:53.833 [550/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:53.833 [551/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:53.833 [552/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:53.833 [553/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:53.833 [554/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:53.833 [555/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:53.833 [556/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:53.833 [557/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:53.833 [558/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:53.833 [559/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.833 [560/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:54.092 [561/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:54.092 [562/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:54.092 [563/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:54.092 [564/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:54.092 [565/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:54.092 [566/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:54.092 [567/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.092 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:54.351 [569/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.351 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:54.351 [571/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:54.351 [572/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.610 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:54.610 [574/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:54.610 [575/707] Linking static target lib/librte_ethdev.a 00:01:54.869 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:54.869 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:55.129 [578/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:55.129 [579/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:55.387 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:55.956 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:55.956 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:55.956 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:56.220 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:56.220 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:56.220 [586/707] Compiling C object 
drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:56.220 [587/707] Linking static target drivers/librte_net_i40e.a 00:01:56.489 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:57.056 [589/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:57.056 [590/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.314 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.884 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:03.166 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.166 [594/707] Linking target lib/librte_eal.so.24.0 00:02:03.166 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:03.166 [596/707] Linking target lib/librte_timer.so.24.0 00:02:03.166 [597/707] Linking target lib/librte_jobstats.so.24.0 00:02:03.166 [598/707] Linking target lib/librte_dmadev.so.24.0 00:02:03.166 [599/707] Linking target lib/librte_acl.so.24.0 00:02:03.166 [600/707] Linking target lib/librte_ring.so.24.0 00:02:03.166 [601/707] Linking target lib/librte_pci.so.24.0 00:02:03.166 [602/707] Linking target lib/librte_stack.so.24.0 00:02:03.166 [603/707] Linking target lib/librte_rawdev.so.24.0 00:02:03.166 [604/707] Linking target lib/librte_meter.so.24.0 00:02:03.166 [605/707] Linking target lib/librte_cfgfile.so.24.0 00:02:03.166 [606/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:03.166 [607/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:03.166 [608/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:03.166 [609/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:03.166 [610/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:03.166 [611/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:03.166 [612/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:03.166 [613/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:03.166 [614/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:03.166 [615/707] Linking target lib/librte_mempool.so.24.0 00:02:03.166 [616/707] Linking target lib/librte_rcu.so.24.0 00:02:03.167 [617/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:03.167 [618/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:03.167 [619/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:03.167 [620/707] Linking target lib/librte_mbuf.so.24.0 00:02:03.167 [621/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:03.167 [622/707] Linking target lib/librte_rib.so.24.0 00:02:03.167 [623/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.167 [624/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:03.167 [625/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:03.167 [626/707] Linking target lib/librte_distributor.so.24.0 00:02:03.167 [627/707] Linking target lib/librte_net.so.24.0 00:02:03.167 [628/707] Linking target lib/librte_gpudev.so.24.0 
00:02:03.167 [629/707] Linking target lib/librte_regexdev.so.24.0 00:02:03.167 [630/707] Linking target lib/librte_bbdev.so.24.0 00:02:03.167 [631/707] Linking target lib/librte_compressdev.so.24.0 00:02:03.167 [632/707] Linking target lib/librte_mldev.so.24.0 00:02:03.167 [633/707] Linking target lib/librte_sched.so.24.0 00:02:03.167 [634/707] Linking target lib/librte_reorder.so.24.0 00:02:03.167 [635/707] Linking target lib/librte_cryptodev.so.24.0 00:02:03.167 [636/707] Linking target lib/librte_fib.so.24.0 00:02:03.167 [637/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:03.167 [638/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:03.167 [639/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:03.167 [640/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:03.167 [641/707] Linking target lib/librte_hash.so.24.0 00:02:03.167 [642/707] Linking target lib/librte_security.so.24.0 00:02:03.426 [643/707] Linking target lib/librte_cmdline.so.24.0 00:02:03.426 [644/707] Linking target lib/librte_ethdev.so.24.0 00:02:03.426 [645/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:03.426 [646/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:03.426 [647/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:03.426 [648/707] Linking target lib/librte_member.so.24.0 00:02:03.426 [649/707] Linking target lib/librte_efd.so.24.0 00:02:03.426 [650/707] Linking target lib/librte_pdcp.so.24.0 00:02:03.426 [651/707] Linking target lib/librte_lpm.so.24.0 00:02:03.426 [652/707] Linking target lib/librte_ipsec.so.24.0 00:02:03.426 [653/707] Linking target lib/librte_power.so.24.0 00:02:03.426 [654/707] Linking target lib/librte_metrics.so.24.0 00:02:03.426 [655/707] Linking target lib/librte_ip_frag.so.24.0 00:02:03.426 [656/707] Linking target lib/librte_pcapng.so.24.0 00:02:03.426 [657/707] Linking target lib/librte_gro.so.24.0 00:02:03.426 [658/707] Linking target lib/librte_gso.so.24.0 00:02:03.426 [659/707] Linking target lib/librte_bpf.so.24.0 00:02:03.426 [660/707] Linking target lib/librte_eventdev.so.24.0 00:02:03.426 [661/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:03.686 [662/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:03.686 [663/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:03.686 [664/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:03.686 [665/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:03.686 [666/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:03.686 [667/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:03.686 [668/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:03.686 [669/707] Linking target lib/librte_dispatcher.so.24.0 00:02:03.686 [670/707] Linking target lib/librte_graph.so.24.0 00:02:03.686 [671/707] Linking target lib/librte_pdump.so.24.0 00:02:03.686 [672/707] Linking target lib/librte_latencystats.so.24.0 00:02:03.686 [673/707] Linking target lib/librte_bitratestats.so.24.0 00:02:03.686 [674/707] Linking target lib/librte_port.so.24.0 00:02:03.686 [675/707] Generating symbol file 
lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:03.946 [676/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:03.946 [677/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:03.946 [678/707] Linking target lib/librte_node.so.24.0 00:02:03.946 [679/707] Linking static target lib/librte_pipeline.a 00:02:03.946 [680/707] Linking target lib/librte_table.so.24.0 00:02:03.946 [681/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:04.207 [682/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:04.207 [683/707] Linking static target lib/librte_vhost.a 00:02:04.773 [684/707] Linking target app/dpdk-graph 00:02:04.773 [685/707] Linking target app/dpdk-pdump 00:02:04.773 [686/707] Linking target app/dpdk-proc-info 00:02:04.773 [687/707] Linking target app/dpdk-test-bbdev 00:02:04.773 [688/707] Linking target app/dpdk-test-acl 00:02:04.773 [689/707] Linking target app/dpdk-test-dma-perf 00:02:04.773 [690/707] Linking target app/dpdk-test-sad 00:02:04.773 [691/707] Linking target app/dpdk-test-regex 00:02:04.773 [692/707] Linking target app/dpdk-test-crypto-perf 00:02:04.773 [693/707] Linking target app/dpdk-test-flow-perf 00:02:04.773 [694/707] Linking target app/dpdk-dumpcap 00:02:04.773 [695/707] Linking target app/dpdk-test-cmdline 00:02:04.773 [696/707] Linking target app/dpdk-test-gpudev 00:02:04.773 [697/707] Linking target app/dpdk-test-security-perf 00:02:04.773 [698/707] Linking target app/dpdk-test-mldev 00:02:04.773 [699/707] Linking target app/dpdk-test-pipeline 00:02:04.773 [700/707] Linking target app/dpdk-test-eventdev 00:02:04.773 [701/707] Linking target app/dpdk-test-fib 00:02:04.773 [702/707] Linking target app/dpdk-test-compress-perf 00:02:04.773 [703/707] Linking target app/dpdk-testpmd 00:02:06.675 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.675 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:09.976 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.976 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:09.976 08:14:22 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s 00:02:09.976 08:14:22 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:09.976 08:14:22 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:09.976 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:09.976 [0/1] Installing files. 
00:02:09.976 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.976 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:09.977 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.977 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:09.978 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:09.979 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.979 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:09.980 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:09.980 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:09.981 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.981 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.982 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:09.983 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:10.242 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:10.242 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.242 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.242 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:10.243 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_cryptodev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.243 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:10.507 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.507 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:10.508 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:10.508 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:10.508 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.508 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:10.508 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-sad to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.508 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.509 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.510 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.511 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.512 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:10.513 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:10.513 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:10.513 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:10.513 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:10.513 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:10.513 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:10.513 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:10.513 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:10.513 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:10.513 Installing symlink pointing to librte_ring.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:10.513 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:10.513 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:10.513 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:10.513 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:10.513 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:10.513 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:10.513 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:10.513 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:10.513 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:10.513 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:10.513 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:10.513 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:10.513 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:10.513 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:10.513 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:10.513 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:10.513 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:10.513 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:10.513 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:10.513 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:10.513 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:10.513 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:10.513 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:10.513 Installing symlink pointing to librte_acl.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:10.513 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:10.513 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:10.513 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:10.513 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:10.513 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:10.513 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:10.513 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:10.513 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:10.513 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:10.513 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:10.513 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:10.513 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:10.513 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:10.513 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:10.513 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:10.513 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:10.513 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:10.513 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:10.513 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:10.513 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:10.513 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:10.513 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:10.513 Installing symlink pointing to librte_dispatcher.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:10.513 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:10.513 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:10.513 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:10.513 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:10.513 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:10.513 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:10.513 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:10.513 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:10.513 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:10.514 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:10.514 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:10.514 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:10.514 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:10.514 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:10.514 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:10.514 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:10.514 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:10.514 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:10.514 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:10.514 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:10.514 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:10.514 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:10.514 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:10.514 Installing symlink pointing to 
librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:10.514 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:10.514 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:10.514 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:10.514 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:10.514 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:10.514 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:10.514 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:10.514 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:10.514 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:10.514 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:10.514 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:10.514 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:10.514 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:10.514 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:10.514 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:10.514 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:10.514 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:10.514 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:10.514 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:10.514 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:10.514 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:10.514 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:10.514 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:10.514 Installing symlink pointing to librte_pdump.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:10.514 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:10.514 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:10.514 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:10.514 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:10.514 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:10.514 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:10.514 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:10.514 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:10.514 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:10.514 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:10.514 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:10.514 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:10.514 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:10.514 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:10.514 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:10.514 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:10.514 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:10.514 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:10.514 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:10.514 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:10.514 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:10.514 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:10.514 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:10.514 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:10.514 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:10.514 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:10.514 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:02:10.514 Installing symlink pointing to librte_net_i40e.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:02:10.514 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:02:10.514 08:14:23 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:10.514 08:14:23 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:10.514 00:02:10.514 real 0m28.379s 00:02:10.514 user 8m2.721s 00:02:10.514 sys 2m30.971s 00:02:10.514 08:14:23 build_native_dpdk -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:10.514 08:14:23 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:10.514 ************************************ 00:02:10.514 END TEST build_native_dpdk 00:02:10.514 ************************************ 00:02:10.514 08:14:23 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:10.514 08:14:23 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:10.514 08:14:23 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:10.514 08:14:23 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:10.514 08:14:23 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:10.514 08:14:23 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:10.514 08:14:23 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:10.514 08:14:23 -- common/autotest_common.sh@10 -- $ set +x 00:02:10.514 ************************************ 00:02:10.514 START TEST autobuild_llvm_precompile 00:02:10.514 ************************************ 00:02:10.514 08:14:23 autobuild_llvm_precompile -- common/autotest_common.sh@1125 -- $ _llvm_precompile 00:02:10.514 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:10.774 Target: x86_64-redhat-linux-gnu 00:02:10.774 Thread model: posix 00:02:10.774 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:10.774 08:14:23 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:10.774 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:11.033 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:11.033 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:11.033 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:11.293 Using 'verbs' RDMA provider 00:02:27.114 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:02:39.319 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:02:39.319 Creating mk/config.mk...done. 00:02:39.319 Creating mk/cc.flags.mk...done. 00:02:39.319 Type 'make' to build. 00:02:39.319 00:02:39.319 real 0m28.195s 00:02:39.319 user 0m12.511s 00:02:39.319 sys 0m14.875s 00:02:39.319 08:14:51 autobuild_llvm_precompile -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:39.319 08:14:51 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:02:39.319 ************************************ 00:02:39.319 END TEST autobuild_llvm_precompile 00:02:39.319 ************************************ 00:02:39.319 08:14:51 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:39.319 08:14:51 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:39.319 08:14:51 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:39.319 08:14:51 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:39.319 08:14:51 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:39.319 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:39.319 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:39.319 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:39.319 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:39.886 Using 'verbs' RDMA provider 00:02:53.028 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:03.126 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:03.394 Creating mk/config.mk...done. 00:03:03.394 Creating mk/cc.flags.mk...done. 00:03:03.394 Type 'make' to build. 
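The configure records above capture how this run wires its toolchain and dependencies together: clang 17 is exported as CC/CXX, the libFuzzer runtime is taken from libclang_rt.fuzzer_no_main.a, and the freshly staged DPDK is consumed through the .pc files under dpdk/build/lib/pkgconfig. A condensed sketch of reproducing that setup by hand from the spdk checkout follows; the flag list is abbreviated (the full list is echoed above), and the PKG_CONFIG_PATH export is an assumption about how the staged .pc files would normally be exposed, not something this log shows:

    # sketch: reproduce the toolchain/dependency environment of this configure step
    export CC=clang-17
    export CXX=clang++-17
    DPDK_BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
    export PKG_CONFIG_PATH="$DPDK_BUILD/lib/pkgconfig:${PKG_CONFIG_PATH:-}"   # assumption
    pkg-config --modversion libdpdk   # expected to report the staged 23.11 build
    ./configure --enable-debug --enable-werror \
      --with-dpdk="$DPDK_BUILD" \
      --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
    make -j"$(nproc)"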
00:03:03.394 08:15:16 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:03.394 08:15:16 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:03:03.394 08:15:16 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:03:03.394 08:15:16 -- common/autotest_common.sh@10 -- $ set +x 00:03:03.394 ************************************ 00:03:03.394 START TEST make 00:03:03.394 ************************************ 00:03:03.394 08:15:16 make -- common/autotest_common.sh@1125 -- $ make -j112 00:03:03.961 make[1]: Nothing to be done for 'all'. 00:03:05.339 The Meson build system 00:03:05.339 Version: 1.5.0 00:03:05.339 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:05.339 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:05.340 Build type: native build 00:03:05.340 Project name: libvfio-user 00:03:05.340 Project version: 0.0.1 00:03:05.340 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:05.340 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:05.340 Host machine cpu family: x86_64 00:03:05.340 Host machine cpu: x86_64 00:03:05.340 Run-time dependency threads found: YES 00:03:05.340 Library dl found: YES 00:03:05.340 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:05.340 Run-time dependency json-c found: YES 0.17 00:03:05.340 Run-time dependency cmocka found: YES 1.1.7 00:03:05.340 Program pytest-3 found: NO 00:03:05.340 Program flake8 found: NO 00:03:05.340 Program misspell-fixer found: NO 00:03:05.340 Program restructuredtext-lint found: NO 00:03:05.340 Program valgrind found: YES (/usr/bin/valgrind) 00:03:05.340 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:05.340 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:05.340 Compiler for C supports arguments -Wwrite-strings: YES 00:03:05.340 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:05.340 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:05.340 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:05.340 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:05.340 Build targets in project: 8 00:03:05.340 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:05.340 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:05.340 00:03:05.340 libvfio-user 0.0.1 00:03:05.340 00:03:05.340 User defined options 00:03:05.340 buildtype : debug 00:03:05.340 default_library: static 00:03:05.340 libdir : /usr/local/lib 00:03:05.340 00:03:05.340 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:05.907 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:05.907 [1/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:05.907 [2/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:05.907 [3/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:05.907 [4/36] Compiling C object samples/null.p/null.c.o 00:03:05.907 [5/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:05.907 [6/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:05.907 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:05.907 [8/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:05.907 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:05.907 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:05.907 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:05.907 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:05.907 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:05.907 [14/36] Compiling C object samples/server.p/server.c.o 00:03:05.907 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:05.907 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:05.907 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:05.907 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:05.907 [19/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:05.907 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:05.907 [21/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:05.907 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:05.907 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:05.907 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:05.907 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:05.907 [26/36] Compiling C object samples/client.p/client.c.o 00:03:05.907 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:05.907 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:05.907 [29/36] Linking target samples/client 00:03:05.907 [30/36] Linking static target lib/libvfio-user.a 00:03:06.166 [31/36] Linking target test/unit_tests 00:03:06.166 [32/36] Linking target samples/null 00:03:06.166 [33/36] Linking target samples/lspci 00:03:06.166 [34/36] Linking target samples/gpio-pci-idio-16 00:03:06.166 [35/36] Linking target samples/shadow_ioeventfd_server 00:03:06.166 [36/36] Linking target samples/server 00:03:06.166 INFO: autodetecting backend as ninja 00:03:06.166 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:06.166 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:06.425 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:06.425 ninja: no work to do. 00:03:18.628 CC lib/ut_mock/mock.o 00:03:18.628 CC lib/log/log_deprecated.o 00:03:18.628 CC lib/log/log.o 00:03:18.628 CC lib/log/log_flags.o 00:03:18.628 CC lib/ut/ut.o 00:03:18.887 LIB libspdk_log.a 00:03:18.887 LIB libspdk_ut_mock.a 00:03:18.887 LIB libspdk_ut.a 00:03:19.146 CC lib/ioat/ioat.o 00:03:19.146 CXX lib/trace_parser/trace.o 00:03:19.146 CC lib/util/base64.o 00:03:19.146 CC lib/util/bit_array.o 00:03:19.146 CC lib/util/cpuset.o 00:03:19.146 CC lib/util/crc16.o 00:03:19.146 CC lib/util/crc32.o 00:03:19.146 CC lib/util/crc32c.o 00:03:19.146 CC lib/util/crc32_ieee.o 00:03:19.146 CC lib/util/crc64.o 00:03:19.146 CC lib/util/dif.o 00:03:19.146 CC lib/util/fd.o 00:03:19.146 CC lib/util/fd_group.o 00:03:19.146 CC lib/util/file.o 00:03:19.146 CC lib/util/hexlify.o 00:03:19.146 CC lib/util/iov.o 00:03:19.146 CC lib/dma/dma.o 00:03:19.146 CC lib/util/math.o 00:03:19.146 CC lib/util/strerror_tls.o 00:03:19.146 CC lib/util/net.o 00:03:19.146 CC lib/util/pipe.o 00:03:19.146 CC lib/util/xor.o 00:03:19.146 CC lib/util/string.o 00:03:19.146 CC lib/util/uuid.o 00:03:19.146 CC lib/util/zipf.o 00:03:19.146 CC lib/util/md5.o 00:03:19.146 CC lib/vfio_user/host/vfio_user.o 00:03:19.146 CC lib/vfio_user/host/vfio_user_pci.o 00:03:19.146 LIB libspdk_ioat.a 00:03:19.146 LIB libspdk_dma.a 00:03:19.405 LIB libspdk_vfio_user.a 00:03:19.405 LIB libspdk_util.a 00:03:19.664 LIB libspdk_trace_parser.a 00:03:19.664 CC lib/env_dpdk/memory.o 00:03:19.664 CC lib/env_dpdk/env.o 00:03:19.664 CC lib/env_dpdk/threads.o 00:03:19.664 CC lib/env_dpdk/pci.o 00:03:19.664 CC lib/env_dpdk/init.o 00:03:19.664 CC lib/env_dpdk/pci_ioat.o 00:03:19.664 CC lib/rdma_provider/common.o 00:03:19.664 CC lib/env_dpdk/pci_idxd.o 00:03:19.664 CC lib/env_dpdk/pci_virtio.o 00:03:19.664 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:19.664 CC lib/env_dpdk/pci_vmd.o 00:03:19.664 CC lib/env_dpdk/sigbus_handler.o 00:03:19.664 CC lib/env_dpdk/pci_event.o 00:03:19.664 CC lib/env_dpdk/pci_dpdk.o 00:03:19.664 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:19.664 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:19.664 CC lib/json/json_util.o 00:03:19.664 CC lib/json/json_write.o 00:03:19.664 CC lib/json/json_parse.o 00:03:19.664 CC lib/rdma_utils/rdma_utils.o 00:03:19.664 CC lib/vmd/led.o 00:03:19.664 CC lib/vmd/vmd.o 00:03:19.664 CC lib/idxd/idxd_user.o 00:03:19.664 CC lib/conf/conf.o 00:03:19.664 CC lib/idxd/idxd.o 00:03:19.664 CC lib/idxd/idxd_kernel.o 00:03:19.923 LIB libspdk_rdma_provider.a 00:03:19.923 LIB libspdk_conf.a 00:03:19.923 LIB libspdk_json.a 00:03:19.923 LIB libspdk_rdma_utils.a 00:03:20.182 LIB libspdk_idxd.a 00:03:20.182 LIB libspdk_vmd.a 00:03:20.182 CC lib/jsonrpc/jsonrpc_server.o 00:03:20.182 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:20.182 CC lib/jsonrpc/jsonrpc_client.o 00:03:20.182 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:20.440 LIB libspdk_jsonrpc.a 00:03:20.699 LIB libspdk_env_dpdk.a 00:03:20.699 CC lib/rpc/rpc.o 00:03:20.958 LIB libspdk_rpc.a 00:03:21.217 CC lib/keyring/keyring.o 00:03:21.217 CC lib/keyring/keyring_rpc.o 00:03:21.217 CC lib/trace/trace_rpc.o 00:03:21.217 CC lib/trace/trace.o 00:03:21.217 CC lib/trace/trace_flags.o 00:03:21.217 CC lib/notify/notify.o 00:03:21.217 CC lib/notify/notify_rpc.o 00:03:21.217 LIB libspdk_notify.a 00:03:21.217 LIB libspdk_keyring.a 00:03:21.217 LIB 
libspdk_trace.a 00:03:21.476 CC lib/sock/sock.o 00:03:21.476 CC lib/sock/sock_rpc.o 00:03:21.735 CC lib/thread/thread.o 00:03:21.735 CC lib/thread/iobuf.o 00:03:21.735 LIB libspdk_sock.a 00:03:21.994 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:21.994 CC lib/nvme/nvme_ctrlr.o 00:03:21.994 CC lib/nvme/nvme_fabric.o 00:03:21.994 CC lib/nvme/nvme_ns_cmd.o 00:03:21.994 CC lib/nvme/nvme_ns.o 00:03:21.994 CC lib/nvme/nvme_pcie_common.o 00:03:21.994 CC lib/nvme/nvme_pcie.o 00:03:21.994 CC lib/nvme/nvme.o 00:03:21.994 CC lib/nvme/nvme_qpair.o 00:03:21.994 CC lib/nvme/nvme_quirks.o 00:03:21.994 CC lib/nvme/nvme_transport.o 00:03:21.994 CC lib/nvme/nvme_discovery.o 00:03:21.994 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:21.994 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:21.994 CC lib/nvme/nvme_opal.o 00:03:21.994 CC lib/nvme/nvme_poll_group.o 00:03:21.994 CC lib/nvme/nvme_tcp.o 00:03:21.994 CC lib/nvme/nvme_io_msg.o 00:03:21.994 CC lib/nvme/nvme_zns.o 00:03:21.994 CC lib/nvme/nvme_stubs.o 00:03:21.994 CC lib/nvme/nvme_auth.o 00:03:21.994 CC lib/nvme/nvme_cuse.o 00:03:21.994 CC lib/nvme/nvme_vfio_user.o 00:03:21.994 CC lib/nvme/nvme_rdma.o 00:03:22.253 LIB libspdk_thread.a 00:03:22.511 CC lib/blob/request.o 00:03:22.511 CC lib/blob/blobstore.o 00:03:22.511 CC lib/blob/blob_bs_dev.o 00:03:22.511 CC lib/blob/zeroes.o 00:03:22.511 CC lib/fsdev/fsdev.o 00:03:22.511 CC lib/fsdev/fsdev_io.o 00:03:22.511 CC lib/fsdev/fsdev_rpc.o 00:03:22.511 CC lib/init/json_config.o 00:03:22.511 CC lib/init/subsystem_rpc.o 00:03:22.511 CC lib/init/subsystem.o 00:03:22.770 CC lib/virtio/virtio_vhost_user.o 00:03:22.770 CC lib/init/rpc.o 00:03:22.770 CC lib/virtio/virtio.o 00:03:22.770 CC lib/accel/accel.o 00:03:22.770 CC lib/accel/accel_rpc.o 00:03:22.770 CC lib/virtio/virtio_vfio_user.o 00:03:22.770 CC lib/accel/accel_sw.o 00:03:22.770 CC lib/virtio/virtio_pci.o 00:03:22.770 CC lib/vfu_tgt/tgt_endpoint.o 00:03:22.770 CC lib/vfu_tgt/tgt_rpc.o 00:03:22.770 LIB libspdk_init.a 00:03:22.770 LIB libspdk_virtio.a 00:03:22.770 LIB libspdk_vfu_tgt.a 00:03:23.029 LIB libspdk_fsdev.a 00:03:23.029 CC lib/event/log_rpc.o 00:03:23.029 CC lib/event/app.o 00:03:23.029 CC lib/event/reactor.o 00:03:23.029 CC lib/event/scheduler_static.o 00:03:23.029 CC lib/event/app_rpc.o 00:03:23.289 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:23.289 LIB libspdk_event.a 00:03:23.289 LIB libspdk_accel.a 00:03:23.289 LIB libspdk_nvme.a 00:03:23.548 LIB libspdk_fuse_dispatcher.a 00:03:23.807 CC lib/bdev/bdev.o 00:03:23.807 CC lib/bdev/part.o 00:03:23.808 CC lib/bdev/bdev_rpc.o 00:03:23.808 CC lib/bdev/bdev_zone.o 00:03:23.808 CC lib/bdev/scsi_nvme.o 00:03:24.376 LIB libspdk_blob.a 00:03:24.635 CC lib/lvol/lvol.o 00:03:24.635 CC lib/blobfs/tree.o 00:03:24.635 CC lib/blobfs/blobfs.o 00:03:25.204 LIB libspdk_lvol.a 00:03:25.204 LIB libspdk_blobfs.a 00:03:25.462 LIB libspdk_bdev.a 00:03:25.720 CC lib/scsi/dev.o 00:03:25.720 CC lib/scsi/lun.o 00:03:25.720 CC lib/scsi/port.o 00:03:25.720 CC lib/scsi/scsi.o 00:03:25.720 CC lib/scsi/scsi_bdev.o 00:03:25.720 CC lib/scsi/scsi_pr.o 00:03:25.720 CC lib/scsi/scsi_rpc.o 00:03:25.720 CC lib/scsi/task.o 00:03:25.720 CC lib/ftl/ftl_core.o 00:03:25.720 CC lib/ftl/ftl_init.o 00:03:25.720 CC lib/ftl/ftl_layout.o 00:03:25.720 CC lib/ftl/ftl_debug.o 00:03:25.720 CC lib/ftl/ftl_l2p_flat.o 00:03:25.720 CC lib/ftl/ftl_io.o 00:03:25.720 CC lib/ftl/ftl_sb.o 00:03:25.720 CC lib/ftl/ftl_l2p.o 00:03:25.720 CC lib/ftl/ftl_band.o 00:03:25.720 CC lib/ftl/ftl_nv_cache.o 00:03:25.720 CC lib/ftl/ftl_band_ops.o 00:03:25.720 CC lib/ftl/ftl_writer.o 00:03:25.720 
CC lib/ftl/ftl_rq.o 00:03:25.720 CC lib/ftl/ftl_reloc.o 00:03:25.720 CC lib/ftl/ftl_l2p_cache.o 00:03:25.720 CC lib/ftl/ftl_p2l.o 00:03:25.720 CC lib/ftl/ftl_p2l_log.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:25.720 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:25.720 CC lib/ftl/utils/ftl_conf.o 00:03:25.720 CC lib/ftl/utils/ftl_md.o 00:03:25.720 CC lib/nvmf/ctrlr.o 00:03:25.720 CC lib/ublk/ublk.o 00:03:25.720 CC lib/nbd/nbd.o 00:03:25.720 CC lib/ublk/ublk_rpc.o 00:03:25.720 CC lib/ftl/utils/ftl_mempool.o 00:03:25.720 CC lib/nbd/nbd_rpc.o 00:03:25.720 CC lib/nvmf/ctrlr_discovery.o 00:03:25.720 CC lib/ftl/utils/ftl_bitmap.o 00:03:25.720 CC lib/nvmf/ctrlr_bdev.o 00:03:25.720 CC lib/ftl/utils/ftl_property.o 00:03:25.720 CC lib/nvmf/subsystem.o 00:03:25.721 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:25.721 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:25.721 CC lib/nvmf/nvmf.o 00:03:25.721 CC lib/nvmf/transport.o 00:03:25.721 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:25.721 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:25.721 CC lib/nvmf/nvmf_rpc.o 00:03:25.721 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:25.721 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:25.721 CC lib/nvmf/vfio_user.o 00:03:25.721 CC lib/nvmf/stubs.o 00:03:25.721 CC lib/nvmf/mdns_server.o 00:03:25.721 CC lib/nvmf/tcp.o 00:03:25.721 CC lib/nvmf/rdma.o 00:03:25.721 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:25.721 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:25.721 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:25.721 CC lib/nvmf/auth.o 00:03:25.721 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:25.721 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:25.721 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:25.721 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:25.721 CC lib/ftl/base/ftl_base_dev.o 00:03:25.721 CC lib/ftl/base/ftl_base_bdev.o 00:03:25.721 CC lib/ftl/ftl_trace.o 00:03:25.980 LIB libspdk_nbd.a 00:03:26.237 LIB libspdk_scsi.a 00:03:26.237 LIB libspdk_ublk.a 00:03:26.496 LIB libspdk_ftl.a 00:03:26.496 CC lib/iscsi/init_grp.o 00:03:26.496 CC lib/vhost/vhost.o 00:03:26.496 CC lib/iscsi/conn.o 00:03:26.496 CC lib/iscsi/param.o 00:03:26.496 CC lib/vhost/vhost_blk.o 00:03:26.496 CC lib/vhost/vhost_rpc.o 00:03:26.496 CC lib/iscsi/portal_grp.o 00:03:26.496 CC lib/iscsi/iscsi.o 00:03:26.496 CC lib/vhost/vhost_scsi.o 00:03:26.496 CC lib/vhost/rte_vhost_user.o 00:03:26.496 CC lib/iscsi/tgt_node.o 00:03:26.496 CC lib/iscsi/iscsi_subsystem.o 00:03:26.496 CC lib/iscsi/iscsi_rpc.o 00:03:26.496 CC lib/iscsi/task.o 00:03:27.064 LIB libspdk_nvmf.a 00:03:27.064 LIB libspdk_vhost.a 00:03:27.323 LIB libspdk_iscsi.a 00:03:27.582 CC module/env_dpdk/env_dpdk_rpc.o 00:03:27.582 CC module/vfu_device/vfu_virtio_blk.o 00:03:27.582 CC module/vfu_device/vfu_virtio.o 00:03:27.582 CC module/vfu_device/vfu_virtio_scsi.o 00:03:27.582 CC module/vfu_device/vfu_virtio_fs.o 00:03:27.582 CC module/vfu_device/vfu_virtio_rpc.o 00:03:27.840 LIB libspdk_env_dpdk_rpc.a 00:03:27.840 CC module/blob/bdev/blob_bdev.o 00:03:27.841 CC module/sock/posix/posix.o 00:03:27.841 CC module/fsdev/aio/linux_aio_mgr.o 
00:03:27.841 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:27.841 CC module/keyring/linux/keyring.o 00:03:27.841 CC module/fsdev/aio/fsdev_aio.o 00:03:27.841 CC module/accel/ioat/accel_ioat_rpc.o 00:03:27.841 CC module/accel/ioat/accel_ioat.o 00:03:27.841 CC module/keyring/linux/keyring_rpc.o 00:03:27.841 CC module/keyring/file/keyring.o 00:03:27.841 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:27.841 CC module/keyring/file/keyring_rpc.o 00:03:27.841 CC module/accel/error/accel_error.o 00:03:27.841 CC module/scheduler/gscheduler/gscheduler.o 00:03:27.841 CC module/accel/iaa/accel_iaa.o 00:03:27.841 CC module/accel/error/accel_error_rpc.o 00:03:27.841 CC module/accel/iaa/accel_iaa_rpc.o 00:03:27.841 CC module/accel/dsa/accel_dsa.o 00:03:27.841 CC module/accel/dsa/accel_dsa_rpc.o 00:03:27.841 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:27.841 LIB libspdk_keyring_file.a 00:03:27.841 LIB libspdk_keyring_linux.a 00:03:27.841 LIB libspdk_accel_ioat.a 00:03:27.841 LIB libspdk_scheduler_gscheduler.a 00:03:27.841 LIB libspdk_scheduler_dpdk_governor.a 00:03:27.841 LIB libspdk_scheduler_dynamic.a 00:03:27.841 LIB libspdk_accel_error.a 00:03:27.841 LIB libspdk_accel_iaa.a 00:03:28.099 LIB libspdk_blob_bdev.a 00:03:28.099 LIB libspdk_accel_dsa.a 00:03:28.099 LIB libspdk_vfu_device.a 00:03:28.099 LIB libspdk_fsdev_aio.a 00:03:28.357 LIB libspdk_sock_posix.a 00:03:28.357 CC module/blobfs/bdev/blobfs_bdev.o 00:03:28.357 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:28.357 CC module/bdev/error/vbdev_error.o 00:03:28.357 CC module/bdev/error/vbdev_error_rpc.o 00:03:28.357 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:28.357 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:28.357 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:28.357 CC module/bdev/passthru/vbdev_passthru.o 00:03:28.357 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:28.357 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:28.357 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:28.357 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:28.357 CC module/bdev/delay/vbdev_delay.o 00:03:28.357 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:28.357 CC module/bdev/ftl/bdev_ftl.o 00:03:28.357 CC module/bdev/gpt/gpt.o 00:03:28.357 CC module/bdev/split/vbdev_split.o 00:03:28.357 CC module/bdev/nvme/bdev_nvme.o 00:03:28.357 CC module/bdev/gpt/vbdev_gpt.o 00:03:28.357 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:28.357 CC module/bdev/nvme/nvme_rpc.o 00:03:28.357 CC module/bdev/split/vbdev_split_rpc.o 00:03:28.357 CC module/bdev/malloc/bdev_malloc.o 00:03:28.357 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:28.357 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:28.357 CC module/bdev/nvme/bdev_mdns_client.o 00:03:28.357 CC module/bdev/aio/bdev_aio.o 00:03:28.357 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:28.357 CC module/bdev/iscsi/bdev_iscsi.o 00:03:28.357 CC module/bdev/nvme/vbdev_opal.o 00:03:28.357 CC module/bdev/lvol/vbdev_lvol.o 00:03:28.357 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:28.357 CC module/bdev/raid/bdev_raid.o 00:03:28.357 CC module/bdev/aio/bdev_aio_rpc.o 00:03:28.357 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:28.357 CC module/bdev/raid/bdev_raid_rpc.o 00:03:28.357 CC module/bdev/raid/bdev_raid_sb.o 00:03:28.358 CC module/bdev/raid/raid0.o 00:03:28.358 CC module/bdev/raid/raid1.o 00:03:28.358 CC module/bdev/raid/concat.o 00:03:28.358 CC module/bdev/null/bdev_null.o 00:03:28.358 CC module/bdev/null/bdev_null_rpc.o 00:03:28.616 LIB libspdk_blobfs_bdev.a 00:03:28.616 LIB libspdk_bdev_error.a 00:03:28.616 LIB libspdk_bdev_split.a 
00:03:28.616 LIB libspdk_bdev_gpt.a 00:03:28.616 LIB libspdk_bdev_passthru.a 00:03:28.616 LIB libspdk_bdev_zone_block.a 00:03:28.616 LIB libspdk_bdev_ftl.a 00:03:28.616 LIB libspdk_bdev_null.a 00:03:28.616 LIB libspdk_bdev_aio.a 00:03:28.616 LIB libspdk_bdev_iscsi.a 00:03:28.616 LIB libspdk_bdev_delay.a 00:03:28.616 LIB libspdk_bdev_malloc.a 00:03:28.875 LIB libspdk_bdev_lvol.a 00:03:28.875 LIB libspdk_bdev_virtio.a 00:03:29.134 LIB libspdk_bdev_raid.a 00:03:29.702 LIB libspdk_bdev_nvme.a 00:03:30.271 CC module/event/subsystems/keyring/keyring.o 00:03:30.271 CC module/event/subsystems/vmd/vmd.o 00:03:30.271 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:30.271 CC module/event/subsystems/scheduler/scheduler.o 00:03:30.271 CC module/event/subsystems/fsdev/fsdev.o 00:03:30.271 CC module/event/subsystems/sock/sock.o 00:03:30.271 CC module/event/subsystems/iobuf/iobuf.o 00:03:30.271 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:30.271 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:30.271 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:30.271 LIB libspdk_event_keyring.a 00:03:30.271 LIB libspdk_event_scheduler.a 00:03:30.271 LIB libspdk_event_vmd.a 00:03:30.271 LIB libspdk_event_fsdev.a 00:03:30.271 LIB libspdk_event_vfu_tgt.a 00:03:30.271 LIB libspdk_event_sock.a 00:03:30.530 LIB libspdk_event_vhost_blk.a 00:03:30.530 LIB libspdk_event_iobuf.a 00:03:30.790 CC module/event/subsystems/accel/accel.o 00:03:30.790 LIB libspdk_event_accel.a 00:03:31.050 CC module/event/subsystems/bdev/bdev.o 00:03:31.309 LIB libspdk_event_bdev.a 00:03:31.567 CC module/event/subsystems/nbd/nbd.o 00:03:31.567 CC module/event/subsystems/scsi/scsi.o 00:03:31.567 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:31.567 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:31.567 CC module/event/subsystems/ublk/ublk.o 00:03:31.568 LIB libspdk_event_nbd.a 00:03:31.568 LIB libspdk_event_scsi.a 00:03:31.568 LIB libspdk_event_ublk.a 00:03:31.827 LIB libspdk_event_nvmf.a 00:03:32.086 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:32.086 CC module/event/subsystems/iscsi/iscsi.o 00:03:32.086 LIB libspdk_event_vhost_scsi.a 00:03:32.086 LIB libspdk_event_iscsi.a 00:03:32.345 CXX app/trace/trace.o 00:03:32.345 CC app/trace_record/trace_record.o 00:03:32.345 CC app/spdk_nvme_perf/perf.o 00:03:32.345 CC app/spdk_lspci/spdk_lspci.o 00:03:32.345 CC app/spdk_nvme_identify/identify.o 00:03:32.345 CC app/spdk_top/spdk_top.o 00:03:32.345 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:32.345 CC test/rpc_client/rpc_client_test.o 00:03:32.345 CC app/spdk_nvme_discover/discovery_aer.o 00:03:32.345 TEST_HEADER include/spdk/accel_module.h 00:03:32.345 TEST_HEADER include/spdk/accel.h 00:03:32.345 TEST_HEADER include/spdk/barrier.h 00:03:32.345 TEST_HEADER include/spdk/assert.h 00:03:32.345 TEST_HEADER include/spdk/bdev.h 00:03:32.345 TEST_HEADER include/spdk/base64.h 00:03:32.345 TEST_HEADER include/spdk/bdev_module.h 00:03:32.345 TEST_HEADER include/spdk/bdev_zone.h 00:03:32.345 TEST_HEADER include/spdk/bit_pool.h 00:03:32.345 TEST_HEADER include/spdk/bit_array.h 00:03:32.345 TEST_HEADER include/spdk/blob_bdev.h 00:03:32.345 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:32.345 TEST_HEADER include/spdk/blob.h 00:03:32.345 TEST_HEADER include/spdk/blobfs.h 00:03:32.611 TEST_HEADER include/spdk/conf.h 00:03:32.611 TEST_HEADER include/spdk/cpuset.h 00:03:32.611 TEST_HEADER include/spdk/crc16.h 00:03:32.611 TEST_HEADER include/spdk/crc32.h 00:03:32.611 TEST_HEADER include/spdk/config.h 00:03:32.611 TEST_HEADER include/spdk/dma.h 
00:03:32.611 TEST_HEADER include/spdk/crc64.h 00:03:32.611 TEST_HEADER include/spdk/dif.h 00:03:32.611 TEST_HEADER include/spdk/env_dpdk.h 00:03:32.611 TEST_HEADER include/spdk/endian.h 00:03:32.611 TEST_HEADER include/spdk/event.h 00:03:32.611 TEST_HEADER include/spdk/env.h 00:03:32.611 TEST_HEADER include/spdk/fd.h 00:03:32.611 TEST_HEADER include/spdk/fd_group.h 00:03:32.611 TEST_HEADER include/spdk/file.h 00:03:32.611 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:32.611 TEST_HEADER include/spdk/fsdev.h 00:03:32.611 TEST_HEADER include/spdk/hexlify.h 00:03:32.611 TEST_HEADER include/spdk/fsdev_module.h 00:03:32.611 TEST_HEADER include/spdk/ftl.h 00:03:32.611 CC app/iscsi_tgt/iscsi_tgt.o 00:03:32.611 TEST_HEADER include/spdk/gpt_spec.h 00:03:32.611 TEST_HEADER include/spdk/idxd_spec.h 00:03:32.611 TEST_HEADER include/spdk/histogram_data.h 00:03:32.611 TEST_HEADER include/spdk/idxd.h 00:03:32.611 TEST_HEADER include/spdk/ioat_spec.h 00:03:32.611 TEST_HEADER include/spdk/json.h 00:03:32.611 TEST_HEADER include/spdk/init.h 00:03:32.611 TEST_HEADER include/spdk/ioat.h 00:03:32.611 TEST_HEADER include/spdk/keyring.h 00:03:32.611 TEST_HEADER include/spdk/iscsi_spec.h 00:03:32.611 TEST_HEADER include/spdk/jsonrpc.h 00:03:32.611 TEST_HEADER include/spdk/log.h 00:03:32.611 TEST_HEADER include/spdk/lvol.h 00:03:32.611 TEST_HEADER include/spdk/keyring_module.h 00:03:32.611 TEST_HEADER include/spdk/md5.h 00:03:32.611 TEST_HEADER include/spdk/likely.h 00:03:32.611 CC app/nvmf_tgt/nvmf_main.o 00:03:32.611 TEST_HEADER include/spdk/mmio.h 00:03:32.611 TEST_HEADER include/spdk/nbd.h 00:03:32.611 TEST_HEADER include/spdk/memory.h 00:03:32.611 TEST_HEADER include/spdk/notify.h 00:03:32.611 TEST_HEADER include/spdk/net.h 00:03:32.611 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:32.611 TEST_HEADER include/spdk/nvme_intel.h 00:03:32.611 CC app/spdk_dd/spdk_dd.o 00:03:32.611 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:32.611 TEST_HEADER include/spdk/nvme.h 00:03:32.611 TEST_HEADER include/spdk/nvme_spec.h 00:03:32.611 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:32.611 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:32.611 TEST_HEADER include/spdk/nvme_zns.h 00:03:32.611 TEST_HEADER include/spdk/nvmf.h 00:03:32.611 TEST_HEADER include/spdk/pci_ids.h 00:03:32.611 TEST_HEADER include/spdk/nvmf_spec.h 00:03:32.611 TEST_HEADER include/spdk/nvmf_transport.h 00:03:32.611 TEST_HEADER include/spdk/opal_spec.h 00:03:32.611 TEST_HEADER include/spdk/opal.h 00:03:32.611 TEST_HEADER include/spdk/reduce.h 00:03:32.611 TEST_HEADER include/spdk/queue.h 00:03:32.611 TEST_HEADER include/spdk/pipe.h 00:03:32.611 CC app/spdk_tgt/spdk_tgt.o 00:03:32.611 TEST_HEADER include/spdk/rpc.h 00:03:32.611 TEST_HEADER include/spdk/scheduler.h 00:03:32.611 TEST_HEADER include/spdk/scsi.h 00:03:32.611 TEST_HEADER include/spdk/sock.h 00:03:32.611 TEST_HEADER include/spdk/scsi_spec.h 00:03:32.611 TEST_HEADER include/spdk/stdinc.h 00:03:32.611 TEST_HEADER include/spdk/string.h 00:03:32.611 TEST_HEADER include/spdk/trace.h 00:03:32.611 TEST_HEADER include/spdk/trace_parser.h 00:03:32.611 TEST_HEADER include/spdk/thread.h 00:03:32.611 TEST_HEADER include/spdk/tree.h 00:03:32.611 TEST_HEADER include/spdk/util.h 00:03:32.611 TEST_HEADER include/spdk/ublk.h 00:03:32.611 TEST_HEADER include/spdk/uuid.h 00:03:32.611 TEST_HEADER include/spdk/version.h 00:03:32.611 TEST_HEADER include/spdk/vhost.h 00:03:32.611 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:32.611 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:32.611 TEST_HEADER include/spdk/vmd.h 
00:03:32.611 TEST_HEADER include/spdk/xor.h 00:03:32.611 TEST_HEADER include/spdk/zipf.h 00:03:32.611 CXX test/cpp_headers/assert.o 00:03:32.611 CXX test/cpp_headers/accel_module.o 00:03:32.611 CXX test/cpp_headers/accel.o 00:03:32.611 CXX test/cpp_headers/bdev.o 00:03:32.611 CXX test/cpp_headers/barrier.o 00:03:32.611 CXX test/cpp_headers/base64.o 00:03:32.611 CXX test/cpp_headers/bdev_zone.o 00:03:32.611 CXX test/cpp_headers/bit_array.o 00:03:32.611 CXX test/cpp_headers/bdev_module.o 00:03:32.611 CXX test/cpp_headers/blob_bdev.o 00:03:32.611 CXX test/cpp_headers/bit_pool.o 00:03:32.611 CXX test/cpp_headers/conf.o 00:03:32.611 CXX test/cpp_headers/blobfs_bdev.o 00:03:32.611 CXX test/cpp_headers/blobfs.o 00:03:32.611 CXX test/cpp_headers/config.o 00:03:32.611 CXX test/cpp_headers/cpuset.o 00:03:32.611 CXX test/cpp_headers/blob.o 00:03:32.611 CC examples/util/zipf/zipf.o 00:03:32.611 CXX test/cpp_headers/crc64.o 00:03:32.611 CXX test/cpp_headers/crc16.o 00:03:32.611 CXX test/cpp_headers/crc32.o 00:03:32.611 CC examples/ioat/perf/perf.o 00:03:32.611 CXX test/cpp_headers/dif.o 00:03:32.611 CXX test/cpp_headers/endian.o 00:03:32.611 CXX test/cpp_headers/dma.o 00:03:32.611 CXX test/cpp_headers/env.o 00:03:32.611 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:32.611 CXX test/cpp_headers/fd_group.o 00:03:32.611 CXX test/cpp_headers/env_dpdk.o 00:03:32.611 CXX test/cpp_headers/fd.o 00:03:32.611 CXX test/cpp_headers/event.o 00:03:32.611 CXX test/cpp_headers/file.o 00:03:32.611 CXX test/cpp_headers/fsdev.o 00:03:32.611 CXX test/cpp_headers/fsdev_module.o 00:03:32.611 CXX test/cpp_headers/ftl.o 00:03:32.611 CXX test/cpp_headers/fuse_dispatcher.o 00:03:32.611 CXX test/cpp_headers/hexlify.o 00:03:32.611 CXX test/cpp_headers/gpt_spec.o 00:03:32.611 CXX test/cpp_headers/histogram_data.o 00:03:32.611 CXX test/cpp_headers/idxd.o 00:03:32.611 CXX test/cpp_headers/idxd_spec.o 00:03:32.611 CXX test/cpp_headers/ioat.o 00:03:32.611 CXX test/cpp_headers/init.o 00:03:32.611 CXX test/cpp_headers/ioat_spec.o 00:03:32.611 CXX test/cpp_headers/iscsi_spec.o 00:03:32.611 CXX test/cpp_headers/json.o 00:03:32.611 CC examples/ioat/verify/verify.o 00:03:32.611 CXX test/cpp_headers/keyring.o 00:03:32.611 CXX test/cpp_headers/jsonrpc.o 00:03:32.611 CXX test/cpp_headers/keyring_module.o 00:03:32.611 CXX test/cpp_headers/log.o 00:03:32.611 CXX test/cpp_headers/likely.o 00:03:32.611 CC test/app/histogram_perf/histogram_perf.o 00:03:32.611 CXX test/cpp_headers/lvol.o 00:03:32.611 CXX test/cpp_headers/md5.o 00:03:32.611 CXX test/cpp_headers/memory.o 00:03:32.611 CXX test/cpp_headers/mmio.o 00:03:32.611 CXX test/cpp_headers/nbd.o 00:03:32.611 CXX test/cpp_headers/net.o 00:03:32.611 CXX test/cpp_headers/notify.o 00:03:32.611 CXX test/cpp_headers/nvme.o 00:03:32.611 CC test/app/jsoncat/jsoncat.o 00:03:32.611 CXX test/cpp_headers/nvme_intel.o 00:03:32.611 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:32.611 CXX test/cpp_headers/nvme_ocssd.o 00:03:32.611 CC test/env/memory/memory_ut.o 00:03:32.611 CXX test/cpp_headers/nvme_spec.o 00:03:32.611 CXX test/cpp_headers/nvme_zns.o 00:03:32.611 CXX test/cpp_headers/nvmf_cmd.o 00:03:32.611 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:32.611 CXX test/cpp_headers/nvmf.o 00:03:32.611 CC test/env/vtophys/vtophys.o 00:03:32.611 CXX test/cpp_headers/nvmf_spec.o 00:03:32.611 CXX test/cpp_headers/nvmf_transport.o 00:03:32.611 LINK spdk_lspci 00:03:32.611 CXX test/cpp_headers/opal.o 00:03:32.611 CXX test/cpp_headers/pci_ids.o 00:03:32.611 CXX test/cpp_headers/opal_spec.o 00:03:32.611 CXX 
test/cpp_headers/queue.o 00:03:32.611 CXX test/cpp_headers/pipe.o 00:03:32.611 CC test/thread/poller_perf/poller_perf.o 00:03:32.611 CXX test/cpp_headers/scheduler.o 00:03:32.611 CXX test/cpp_headers/reduce.o 00:03:32.611 CXX test/cpp_headers/scsi.o 00:03:32.611 CXX test/cpp_headers/rpc.o 00:03:32.611 CXX test/cpp_headers/scsi_spec.o 00:03:32.611 CC test/app/stub/stub.o 00:03:32.611 CXX test/cpp_headers/sock.o 00:03:32.611 CXX test/cpp_headers/stdinc.o 00:03:32.611 CC test/env/pci/pci_ut.o 00:03:32.611 CC test/thread/lock/spdk_lock.o 00:03:32.611 CC app/fio/nvme/fio_plugin.o 00:03:32.611 CXX test/cpp_headers/string.o 00:03:32.611 LINK interrupt_tgt 00:03:32.611 CC app/fio/bdev/fio_plugin.o 00:03:32.611 CC test/app/bdev_svc/bdev_svc.o 00:03:32.611 LINK rpc_client_test 00:03:32.611 CXX test/cpp_headers/thread.o 00:03:32.611 CC test/dma/test_dma/test_dma.o 00:03:32.611 LINK spdk_trace_record 00:03:32.612 CC test/env/mem_callbacks/mem_callbacks.o 00:03:32.612 LINK spdk_nvme_discover 00:03:32.612 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:32.612 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:32.612 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:32.875 LINK nvmf_tgt 00:03:32.875 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:32.875 CXX test/cpp_headers/trace.o 00:03:32.875 CXX test/cpp_headers/tree.o 00:03:32.875 LINK env_dpdk_post_init 00:03:32.876 CXX test/cpp_headers/trace_parser.o 00:03:32.876 LINK zipf 00:03:32.876 CXX test/cpp_headers/util.o 00:03:32.876 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:32.876 LINK jsoncat 00:03:32.876 CXX test/cpp_headers/ublk.o 00:03:32.876 CXX test/cpp_headers/uuid.o 00:03:32.876 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:32.876 LINK histogram_perf 00:03:32.876 CXX test/cpp_headers/version.o 00:03:32.876 CXX test/cpp_headers/vfio_user_pci.o 00:03:32.876 CXX test/cpp_headers/vfio_user_spec.o 00:03:32.876 LINK iscsi_tgt 00:03:32.876 LINK poller_perf 00:03:32.876 CXX test/cpp_headers/vhost.o 00:03:32.876 CXX test/cpp_headers/vmd.o 00:03:32.876 CXX test/cpp_headers/xor.o 00:03:32.876 CXX test/cpp_headers/zipf.o 00:03:32.876 LINK vtophys 00:03:32.876 LINK ioat_perf 00:03:32.876 LINK spdk_tgt 00:03:32.876 LINK verify 00:03:32.876 LINK stub 00:03:32.876 LINK spdk_trace 00:03:32.876 LINK bdev_svc 00:03:32.876 LINK pci_ut 00:03:33.137 LINK llvm_vfio_fuzz 00:03:33.138 LINK nvme_fuzz 00:03:33.138 LINK test_dma 00:03:33.138 LINK spdk_dd 00:03:33.138 LINK vhost_fuzz 00:03:33.138 LINK spdk_nvme 00:03:33.138 LINK spdk_bdev 00:03:33.138 LINK spdk_nvme_identify 00:03:33.138 LINK spdk_nvme_perf 00:03:33.138 LINK mem_callbacks 00:03:33.138 LINK spdk_top 00:03:33.138 LINK llvm_nvme_fuzz 00:03:33.396 CC app/vhost/vhost.o 00:03:33.396 CC examples/vmd/lsvmd/lsvmd.o 00:03:33.396 CC examples/vmd/led/led.o 00:03:33.396 CC examples/sock/hello_world/hello_sock.o 00:03:33.396 CC examples/idxd/perf/perf.o 00:03:33.396 CC examples/thread/thread/thread_ex.o 00:03:33.396 LINK vhost 00:03:33.396 LINK lsvmd 00:03:33.655 LINK led 00:03:33.656 LINK memory_ut 00:03:33.656 LINK hello_sock 00:03:33.656 LINK spdk_lock 00:03:33.656 LINK idxd_perf 00:03:33.656 LINK thread 00:03:33.915 LINK iscsi_fuzz 00:03:34.173 CC test/event/reactor_perf/reactor_perf.o 00:03:34.173 CC examples/nvme/reconnect/reconnect.o 00:03:34.173 CC examples/nvme/hello_world/hello_world.o 00:03:34.173 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:34.173 CC test/event/reactor/reactor.o 00:03:34.173 CC test/event/event_perf/event_perf.o 00:03:34.430 CC examples/nvme/pmr_persistence/pmr_persistence.o 
00:03:34.430 CC examples/nvme/arbitration/arbitration.o 00:03:34.430 CC examples/nvme/abort/abort.o 00:03:34.430 CC test/event/app_repeat/app_repeat.o 00:03:34.430 CC examples/nvme/hotplug/hotplug.o 00:03:34.430 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:34.430 CC test/event/scheduler/scheduler.o 00:03:34.430 LINK reactor_perf 00:03:34.430 LINK event_perf 00:03:34.430 LINK reactor 00:03:34.430 LINK pmr_persistence 00:03:34.430 LINK hello_world 00:03:34.430 LINK app_repeat 00:03:34.430 LINK cmb_copy 00:03:34.430 LINK hotplug 00:03:34.430 LINK reconnect 00:03:34.430 LINK scheduler 00:03:34.430 LINK abort 00:03:34.430 LINK arbitration 00:03:34.688 LINK nvme_manage 00:03:34.688 CC test/nvme/overhead/overhead.o 00:03:34.688 CC test/nvme/cuse/cuse.o 00:03:34.688 CC test/nvme/sgl/sgl.o 00:03:34.688 CC test/nvme/e2edp/nvme_dp.o 00:03:34.688 CC test/nvme/startup/startup.o 00:03:34.688 CC test/nvme/boot_partition/boot_partition.o 00:03:34.688 CC test/nvme/connect_stress/connect_stress.o 00:03:34.688 CC test/nvme/reset/reset.o 00:03:34.688 CC test/nvme/compliance/nvme_compliance.o 00:03:34.688 CC test/nvme/fdp/fdp.o 00:03:34.688 CC test/nvme/err_injection/err_injection.o 00:03:34.688 CC test/nvme/reserve/reserve.o 00:03:34.689 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:34.689 CC test/nvme/simple_copy/simple_copy.o 00:03:34.689 CC test/nvme/aer/aer.o 00:03:34.689 CC test/nvme/fused_ordering/fused_ordering.o 00:03:34.689 CC test/blobfs/mkfs/mkfs.o 00:03:34.689 CC test/accel/dif/dif.o 00:03:34.947 CC test/lvol/esnap/esnap.o 00:03:34.947 LINK boot_partition 00:03:34.947 LINK startup 00:03:34.947 LINK connect_stress 00:03:34.947 LINK err_injection 00:03:34.947 LINK doorbell_aers 00:03:34.947 LINK reserve 00:03:34.947 LINK fused_ordering 00:03:34.947 LINK simple_copy 00:03:34.947 LINK nvme_dp 00:03:34.947 LINK sgl 00:03:34.947 LINK overhead 00:03:34.947 LINK fdp 00:03:34.947 LINK aer 00:03:34.947 LINK reset 00:03:34.947 LINK mkfs 00:03:34.947 LINK nvme_compliance 00:03:35.204 LINK dif 00:03:35.461 CC examples/accel/perf/accel_perf.o 00:03:35.461 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:35.461 CC examples/blob/cli/blobcli.o 00:03:35.461 CC examples/blob/hello_world/hello_blob.o 00:03:35.461 LINK cuse 00:03:35.719 LINK hello_blob 00:03:35.719 LINK hello_fsdev 00:03:35.719 LINK accel_perf 00:03:35.719 LINK blobcli 00:03:36.292 CC examples/bdev/hello_world/hello_bdev.o 00:03:36.292 CC examples/bdev/bdevperf/bdevperf.o 00:03:36.551 LINK hello_bdev 00:03:36.810 CC test/bdev/bdevio/bdevio.o 00:03:36.810 LINK bdevperf 00:03:37.069 LINK bdevio 00:03:38.444 LINK esnap 00:03:38.444 CC examples/nvmf/nvmf/nvmf.o 00:03:38.704 LINK nvmf 00:03:40.081 00:03:40.081 real 0m36.323s 00:03:40.081 user 4m38.597s 00:03:40.081 sys 1m41.721s 00:03:40.081 08:15:52 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:40.082 08:15:52 make -- common/autotest_common.sh@10 -- $ set +x 00:03:40.082 ************************************ 00:03:40.082 END TEST make 00:03:40.082 ************************************ 00:03:40.082 08:15:52 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:40.082 08:15:52 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:40.082 08:15:52 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:40.082 08:15:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.082 08:15:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:40.082 08:15:52 -- pm/common@44 -- $ pid=848706 
00:03:40.082 08:15:52 -- pm/common@50 -- $ kill -TERM 848706 00:03:40.082 08:15:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.082 08:15:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:40.082 08:15:52 -- pm/common@44 -- $ pid=848708 00:03:40.082 08:15:52 -- pm/common@50 -- $ kill -TERM 848708 00:03:40.082 08:15:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.082 08:15:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:40.082 08:15:52 -- pm/common@44 -- $ pid=848710 00:03:40.082 08:15:52 -- pm/common@50 -- $ kill -TERM 848710 00:03:40.082 08:15:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.082 08:15:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:40.082 08:15:52 -- pm/common@44 -- $ pid=848736 00:03:40.082 08:15:52 -- pm/common@50 -- $ sudo -E kill -TERM 848736 00:03:40.082 08:15:52 -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:40.082 08:15:52 -- common/autotest_common.sh@1681 -- # lcov --version 00:03:40.082 08:15:52 -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:40.082 08:15:53 -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:40.082 08:15:53 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:40.082 08:15:53 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:40.082 08:15:53 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:40.082 08:15:53 -- scripts/common.sh@336 -- # IFS=.-: 00:03:40.082 08:15:53 -- scripts/common.sh@336 -- # read -ra ver1 00:03:40.082 08:15:53 -- scripts/common.sh@337 -- # IFS=.-: 00:03:40.082 08:15:53 -- scripts/common.sh@337 -- # read -ra ver2 00:03:40.082 08:15:53 -- scripts/common.sh@338 -- # local 'op=<' 00:03:40.082 08:15:53 -- scripts/common.sh@340 -- # ver1_l=2 00:03:40.082 08:15:53 -- scripts/common.sh@341 -- # ver2_l=1 00:03:40.082 08:15:53 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:40.082 08:15:53 -- scripts/common.sh@344 -- # case "$op" in 00:03:40.082 08:15:53 -- scripts/common.sh@345 -- # : 1 00:03:40.082 08:15:53 -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:40.082 08:15:53 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:40.082 08:15:53 -- scripts/common.sh@365 -- # decimal 1 00:03:40.082 08:15:53 -- scripts/common.sh@353 -- # local d=1 00:03:40.082 08:15:53 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:40.082 08:15:53 -- scripts/common.sh@355 -- # echo 1 00:03:40.082 08:15:53 -- scripts/common.sh@365 -- # ver1[v]=1 00:03:40.082 08:15:53 -- scripts/common.sh@366 -- # decimal 2 00:03:40.082 08:15:53 -- scripts/common.sh@353 -- # local d=2 00:03:40.082 08:15:53 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:40.082 08:15:53 -- scripts/common.sh@355 -- # echo 2 00:03:40.082 08:15:53 -- scripts/common.sh@366 -- # ver2[v]=2 00:03:40.082 08:15:53 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:40.082 08:15:53 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:40.082 08:15:53 -- scripts/common.sh@368 -- # return 0 00:03:40.082 08:15:53 -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:40.082 08:15:53 -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:40.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:40.082 --rc genhtml_branch_coverage=1 00:03:40.082 --rc genhtml_function_coverage=1 00:03:40.082 --rc genhtml_legend=1 00:03:40.082 --rc geninfo_all_blocks=1 00:03:40.082 --rc geninfo_unexecuted_blocks=1 00:03:40.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:40.082 ' 00:03:40.082 08:15:53 -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:40.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:40.082 --rc genhtml_branch_coverage=1 00:03:40.082 --rc genhtml_function_coverage=1 00:03:40.082 --rc genhtml_legend=1 00:03:40.082 --rc geninfo_all_blocks=1 00:03:40.082 --rc geninfo_unexecuted_blocks=1 00:03:40.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:40.082 ' 00:03:40.082 08:15:53 -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:40.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:40.082 --rc genhtml_branch_coverage=1 00:03:40.082 --rc genhtml_function_coverage=1 00:03:40.082 --rc genhtml_legend=1 00:03:40.082 --rc geninfo_all_blocks=1 00:03:40.082 --rc geninfo_unexecuted_blocks=1 00:03:40.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:40.082 ' 00:03:40.082 08:15:53 -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:40.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:40.082 --rc genhtml_branch_coverage=1 00:03:40.082 --rc genhtml_function_coverage=1 00:03:40.082 --rc genhtml_legend=1 00:03:40.082 --rc geninfo_all_blocks=1 00:03:40.082 --rc geninfo_unexecuted_blocks=1 00:03:40.082 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:40.082 ' 00:03:40.082 08:15:53 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:40.082 08:15:53 -- nvmf/common.sh@7 -- # uname -s 00:03:40.082 08:15:53 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:40.082 08:15:53 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:40.082 08:15:53 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:40.082 08:15:53 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:40.082 08:15:53 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:40.082 08:15:53 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:40.082 08:15:53 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:40.082 08:15:53 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:40.082 08:15:53 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:40.082 08:15:53 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:40.082 08:15:53 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:40.082 08:15:53 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:40.082 08:15:53 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:40.082 08:15:53 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:40.082 08:15:53 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:40.082 08:15:53 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:40.082 08:15:53 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:40.082 08:15:53 -- scripts/common.sh@15 -- # shopt -s extglob 00:03:40.082 08:15:53 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:40.082 08:15:53 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:40.082 08:15:53 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:40.082 08:15:53 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:40.082 08:15:53 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:40.082 08:15:53 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:40.082 08:15:53 -- paths/export.sh@5 -- # export PATH 00:03:40.082 08:15:53 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:40.082 08:15:53 -- nvmf/common.sh@51 -- # : 0 00:03:40.082 08:15:53 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:03:40.082 08:15:53 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:03:40.082 08:15:53 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:40.082 08:15:53 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:40.082 08:15:53 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:40.082 08:15:53 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:03:40.082 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:03:40.082 08:15:53 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:03:40.082 08:15:53 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:03:40.082 08:15:53 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:03:40.082 08:15:53 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:40.082 08:15:53 -- spdk/autotest.sh@32 -- # uname -s 00:03:40.082 
08:15:53 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:40.082 08:15:53 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:40.082 08:15:53 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:40.082 08:15:53 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:40.082 08:15:53 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:40.082 08:15:53 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:40.082 08:15:53 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:40.082 08:15:53 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:40.082 08:15:53 -- spdk/autotest.sh@48 -- # udevadm_pid=927369 00:03:40.082 08:15:53 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:40.082 08:15:53 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:40.082 08:15:53 -- pm/common@17 -- # local monitor 00:03:40.082 08:15:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.082 08:15:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.083 08:15:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.083 08:15:53 -- pm/common@21 -- # date +%s 00:03:40.083 08:15:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:40.083 08:15:53 -- pm/common@21 -- # date +%s 00:03:40.083 08:15:53 -- pm/common@21 -- # date +%s 00:03:40.083 08:15:53 -- pm/common@25 -- # sleep 1 00:03:40.083 08:15:53 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731827753 00:03:40.083 08:15:53 -- pm/common@21 -- # date +%s 00:03:40.083 08:15:53 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731827753 00:03:40.083 08:15:53 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731827753 00:03:40.083 08:15:53 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731827753 00:03:40.083 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731827753_collect-cpu-temp.pm.log 00:03:40.083 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731827753_collect-cpu-load.pm.log 00:03:40.083 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731827753_collect-vmstat.pm.log 00:03:40.083 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731827753_collect-bmc-pm.bmc.pm.log 00:03:41.017 08:15:54 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:41.017 08:15:54 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:41.017 08:15:54 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:41.017 08:15:54 -- common/autotest_common.sh@10 -- # set +x 
00:03:41.017 08:15:54 -- spdk/autotest.sh@59 -- # create_test_list 00:03:41.017 08:15:54 -- common/autotest_common.sh@748 -- # xtrace_disable 00:03:41.017 08:15:54 -- common/autotest_common.sh@10 -- # set +x 00:03:41.277 08:15:54 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:41.277 08:15:54 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:41.277 08:15:54 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:41.277 08:15:54 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:41.277 08:15:54 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:41.277 08:15:54 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:41.277 08:15:54 -- common/autotest_common.sh@1455 -- # uname 00:03:41.277 08:15:54 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:41.277 08:15:54 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:41.277 08:15:54 -- common/autotest_common.sh@1475 -- # uname 00:03:41.277 08:15:54 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:41.277 08:15:54 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:03:41.277 08:15:54 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:41.277 lcov: LCOV version 1.15 00:03:41.277 08:15:54 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:46.541 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:03:51.805 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:57.072 08:16:09 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:03:57.072 08:16:09 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:57.072 08:16:09 -- common/autotest_common.sh@10 -- # set +x 00:03:57.072 08:16:09 -- spdk/autotest.sh@78 -- # rm -f 00:03:57.072 08:16:09 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:00.351 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:00.351 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:00.351 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:00.351 08:16:13 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:00.351 08:16:13 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:00.351 08:16:13 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:00.351 08:16:13 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:00.351 08:16:13 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:00.351 08:16:13 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:00.351 08:16:13 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:00.351 08:16:13 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:00.351 08:16:13 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:00.351 08:16:13 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:00.351 08:16:13 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:00.351 08:16:13 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:00.351 08:16:13 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:00.351 08:16:13 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:00.351 08:16:13 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:00.351 No valid GPT data, bailing 00:04:00.351 08:16:13 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:00.351 08:16:13 -- scripts/common.sh@394 -- # pt= 00:04:00.351 08:16:13 -- scripts/common.sh@395 -- # return 1 00:04:00.351 08:16:13 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:00.351 1+0 records in 00:04:00.351 1+0 records out 00:04:00.351 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00139104 s, 754 MB/s 00:04:00.351 08:16:13 -- spdk/autotest.sh@105 -- # sync 00:04:00.351 08:16:13 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:00.351 08:16:13 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:00.351 08:16:13 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:06.903 08:16:19 -- spdk/autotest.sh@111 -- # uname -s 00:04:06.903 08:16:19 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:06.903 08:16:19 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:06.903 08:16:19 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:06.903 08:16:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:06.903 08:16:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:06.903 08:16:19 -- common/autotest_common.sh@10 -- # set +x 00:04:06.903 ************************************ 00:04:06.903 START TEST setup.sh 00:04:06.903 ************************************ 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:06.903 * Looking for test storage... 
00:04:06.903 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1681 -- # lcov --version 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:06.903 08:16:19 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:06.903 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.903 --rc genhtml_branch_coverage=1 00:04:06.903 --rc genhtml_function_coverage=1 00:04:06.903 --rc genhtml_legend=1 00:04:06.903 --rc geninfo_all_blocks=1 00:04:06.903 --rc geninfo_unexecuted_blocks=1 00:04:06.903 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.903 ' 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:06.903 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.903 --rc genhtml_branch_coverage=1 00:04:06.903 --rc genhtml_function_coverage=1 00:04:06.903 --rc genhtml_legend=1 00:04:06.903 --rc geninfo_all_blocks=1 00:04:06.903 --rc geninfo_unexecuted_blocks=1 
00:04:06.903 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.903 ' 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:06.903 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.903 --rc genhtml_branch_coverage=1 00:04:06.903 --rc genhtml_function_coverage=1 00:04:06.903 --rc genhtml_legend=1 00:04:06.903 --rc geninfo_all_blocks=1 00:04:06.903 --rc geninfo_unexecuted_blocks=1 00:04:06.903 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.903 ' 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:06.903 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.903 --rc genhtml_branch_coverage=1 00:04:06.903 --rc genhtml_function_coverage=1 00:04:06.903 --rc genhtml_legend=1 00:04:06.903 --rc geninfo_all_blocks=1 00:04:06.903 --rc geninfo_unexecuted_blocks=1 00:04:06.903 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.903 ' 00:04:06.903 08:16:19 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:06.903 08:16:19 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:06.903 08:16:19 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:06.903 08:16:19 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:06.903 ************************************ 00:04:06.903 START TEST acl 00:04:06.903 ************************************ 00:04:06.903 08:16:19 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:06.903 * Looking for test storage... 
00:04:06.903 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:06.903 08:16:19 setup.sh.acl -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:06.903 08:16:19 setup.sh.acl -- common/autotest_common.sh@1681 -- # lcov --version 00:04:06.903 08:16:19 setup.sh.acl -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:07.163 08:16:20 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:07.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:07.163 --rc genhtml_branch_coverage=1 00:04:07.163 --rc genhtml_function_coverage=1 00:04:07.163 --rc genhtml_legend=1 00:04:07.163 --rc geninfo_all_blocks=1 00:04:07.163 --rc geninfo_unexecuted_blocks=1 00:04:07.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:07.163 ' 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:07.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:07.163 --rc genhtml_branch_coverage=1 00:04:07.163 --rc 
genhtml_function_coverage=1 00:04:07.163 --rc genhtml_legend=1 00:04:07.163 --rc geninfo_all_blocks=1 00:04:07.163 --rc geninfo_unexecuted_blocks=1 00:04:07.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:07.163 ' 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:07.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:07.163 --rc genhtml_branch_coverage=1 00:04:07.163 --rc genhtml_function_coverage=1 00:04:07.163 --rc genhtml_legend=1 00:04:07.163 --rc geninfo_all_blocks=1 00:04:07.163 --rc geninfo_unexecuted_blocks=1 00:04:07.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:07.163 ' 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:07.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:07.163 --rc genhtml_branch_coverage=1 00:04:07.163 --rc genhtml_function_coverage=1 00:04:07.163 --rc genhtml_legend=1 00:04:07.163 --rc geninfo_all_blocks=1 00:04:07.163 --rc geninfo_unexecuted_blocks=1 00:04:07.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:07.163 ' 00:04:07.163 08:16:20 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:07.163 08:16:20 setup.sh.acl -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:07.163 08:16:20 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:07.163 08:16:20 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:07.163 08:16:20 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:07.163 08:16:20 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:07.163 08:16:20 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:07.163 08:16:20 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:07.163 08:16:20 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:11.351 08:16:23 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:11.351 08:16:23 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:11.351 08:16:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:11.351 08:16:23 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:11.351 08:16:23 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.351 08:16:23 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:13.952 Hugepages 00:04:13.952 node hugesize free / total 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 00:04:13.952 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.952 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:14.271 08:16:27 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:14.271 08:16:27 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:14.271 08:16:27 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:14.271 08:16:27 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:14.271 ************************************ 00:04:14.271 START TEST denied 00:04:14.271 ************************************ 00:04:14.271 08:16:27 setup.sh.acl.denied -- 
common/autotest_common.sh@1125 -- # denied 00:04:14.271 08:16:27 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:14.271 08:16:27 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:14.271 08:16:27 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:14.271 08:16:27 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:14.271 08:16:27 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:17.591 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:17.591 08:16:30 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:17.592 08:16:30 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:17.592 08:16:30 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:21.776 00:04:21.776 real 0m7.482s 00:04:21.776 user 0m2.230s 00:04:21.776 sys 0m4.533s 00:04:21.776 08:16:34 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:21.776 08:16:34 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:21.776 ************************************ 00:04:21.776 END TEST denied 00:04:21.776 ************************************ 00:04:21.776 08:16:34 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:21.776 08:16:34 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:21.776 08:16:34 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:21.776 08:16:34 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:21.776 ************************************ 00:04:21.776 START TEST allowed 00:04:21.776 ************************************ 00:04:21.776 08:16:34 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:04:21.776 08:16:34 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:21.776 08:16:34 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:21.776 08:16:34 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:21.776 08:16:34 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.776 08:16:34 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:27.049 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:27.049 08:16:39 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:27.049 08:16:39 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:27.049 08:16:39 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:27.049 08:16:39 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:27.049 08:16:39 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:31.248 00:04:31.248 real 0m8.909s 00:04:31.248 user 0m2.614s 00:04:31.248 sys 0m4.902s 00:04:31.248 08:16:43 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:31.248 08:16:43 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:31.248 ************************************ 00:04:31.248 END TEST allowed 00:04:31.248 ************************************ 00:04:31.248 00:04:31.248 real 0m23.846s 00:04:31.248 user 0m7.455s 00:04:31.248 sys 0m14.540s 00:04:31.248 08:16:43 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:31.248 08:16:43 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:31.248 ************************************ 00:04:31.248 END TEST acl 00:04:31.248 ************************************ 00:04:31.248 08:16:43 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:31.248 08:16:43 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:31.248 08:16:43 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:31.248 08:16:43 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:31.248 ************************************ 00:04:31.248 START TEST hugepages 00:04:31.248 ************************************ 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:31.248 * Looking for test storage... 00:04:31.248 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lcov --version 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:31.248 08:16:43 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:31.248 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:31.248 --rc genhtml_branch_coverage=1 00:04:31.248 --rc genhtml_function_coverage=1 00:04:31.248 --rc genhtml_legend=1 00:04:31.248 --rc geninfo_all_blocks=1 00:04:31.248 --rc geninfo_unexecuted_blocks=1 00:04:31.248 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:31.248 ' 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:31.248 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:31.248 --rc genhtml_branch_coverage=1 00:04:31.248 --rc genhtml_function_coverage=1 00:04:31.248 --rc genhtml_legend=1 00:04:31.248 --rc geninfo_all_blocks=1 00:04:31.248 --rc geninfo_unexecuted_blocks=1 00:04:31.248 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:31.248 ' 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:31.248 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:31.248 --rc genhtml_branch_coverage=1 00:04:31.248 --rc genhtml_function_coverage=1 00:04:31.248 --rc genhtml_legend=1 00:04:31.248 --rc geninfo_all_blocks=1 00:04:31.248 --rc geninfo_unexecuted_blocks=1 00:04:31.248 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:31.248 ' 00:04:31.248 08:16:43 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:31.248 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:31.248 --rc genhtml_branch_coverage=1 00:04:31.248 --rc genhtml_function_coverage=1 00:04:31.248 --rc genhtml_legend=1 00:04:31.248 --rc geninfo_all_blocks=1 00:04:31.248 --rc geninfo_unexecuted_blocks=1 00:04:31.248 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:31.248 ' 00:04:31.248 08:16:43 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:31.248 08:16:43 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:31.248 08:16:43 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:31.249 08:16:43 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.249 08:16:43 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 38809508 kB' 'MemAvailable: 42538508 kB' 'Buffers: 8940 kB' 'Cached: 13218364 kB' 'SwapCached: 0 kB' 'Active: 10043344 kB' 'Inactive: 3688272 kB' 'Active(anon): 9626904 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 507752 kB' 'Mapped: 160528 kB' 'Shmem: 9122592 kB' 'KReclaimable: 242308 kB' 'Slab: 936256 kB' 'SReclaimable: 242308 kB' 'SUnreclaim: 693948 kB' 'KernelStack: 21824 kB' 'PageTables: 7796 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433344 kB' 'Committed_AS: 10831920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214112 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
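The xtrace entries around this point are setup/common.sh's get_meminfo helper walking /proc/meminfo one key at a time until it reaches Hugepagesize and echoes its value. A minimal sketch of that pattern, assuming the system-wide /proc/meminfo case shown here (the function name below is illustrative, not SPDK's exact helper):

    get_meminfo_field() {
        # e.g. get_meminfo_field Hugepagesize -> "2048" on this runner (value in kB)
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < /proc/meminfo
        return 1
    }

The per-node variant visible in the trace does the same thing against /sys/devices/system/node/node<N>/meminfo after stripping the leading "Node <N>" prefix from each line.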
00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.249 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 
08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 
08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 
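With default_hugepages resolved to 2048 kB, the entries that follow show hugepages.sh enumerating the two NUMA nodes on this host and clearing any pre-existing per-node reservations before the test configures its own. Roughly the same effect, using the standard sysfs hugepage files (the helper name is illustrative, not SPDK's actual function):

    clear_node_hugepages() {
        # Zero every per-node, per-size hugepage pool, mirroring the clear_hp trace below
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*; do
                echo 0 > "$hp/nr_hugepages"
            done
        done
        export CLEAR_HUGE=yes
    }

Writing these sysfs files requires root, which the autotest harness runs as.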
00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:31.250 08:16:44 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:04:31.250 08:16:44 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:31.250 08:16:44 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:31.250 08:16:44 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:31.250 ************************************ 00:04:31.250 START TEST single_node_setup 00:04:31.250 ************************************ 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1125 -- # single_node_setup 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:04:31.250 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.251 08:16:44 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:34.541 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:34.541 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:35.923 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40960108 kB' 'MemAvailable: 44688624 kB' 'Buffers: 8940 kB' 'Cached: 13218504 kB' 'SwapCached: 0 kB' 'Active: 10045772 kB' 'Inactive: 3688272 kB' 'Active(anon): 9629332 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510092 kB' 'Mapped: 160644 kB' 'Shmem: 9122732 kB' 'KReclaimable: 241340 kB' 'Slab: 934584 kB' 'SReclaimable: 241340 kB' 'SUnreclaim: 693244 kB' 'KernelStack: 21824 kB' 'PageTables: 7580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10832660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.923 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.924 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:35.925 08:16:48 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:35.925 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40970936 kB' 'MemAvailable: 44699452 kB' 'Buffers: 8940 kB' 'Cached: 13218504 kB' 'SwapCached: 0 kB' 'Active: 10045540 kB' 'Inactive: 3688272 kB' 'Active(anon): 9629100 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509172 kB' 'Mapped: 160568 kB' 'Shmem: 9122732 kB' 'KReclaimable: 241340 kB' 'Slab: 934228 kB' 'SReclaimable: 241340 kB' 'SUnreclaim: 692888 kB' 'KernelStack: 21968 kB' 'PageTables: 7792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10832676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.925 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.926 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:35.927 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40974460 kB' 'MemAvailable: 44702976 kB' 'Buffers: 8940 kB' 'Cached: 13218532 kB' 'SwapCached: 0 kB' 'Active: 10045840 kB' 'Inactive: 3688272 kB' 'Active(anon): 9629400 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509904 kB' 'Mapped: 160568 kB' 'Shmem: 9122760 kB' 'KReclaimable: 241340 kB' 'Slab: 934228 kB' 'SReclaimable: 241340 kB' 'SUnreclaim: 692888 kB' 'KernelStack: 22048 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10832700 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.928 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.929 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.930 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:35.930 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:04:36.191 nr_hugepages=1024 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:36.191 resv_hugepages=0 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:36.191 surplus_hugepages=0 00:04:36.191 08:16:49 
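The loop traced above is setup/common.sh walking /proc/meminfo one line at a time: IFS=': ' together with read -r var val _ splits each "Key: value" pair, every key other than the requested HugePages_Rsvd takes the continue branch, and the matching key's value (0 here) is echoed back to the caller, which records resv=0 alongside nr_hugepages=1024 and surplus_hugepages=0. A minimal stand-alone sketch of that pattern, with an assumed helper name rather than the upstream get_meminfo verbatim:

get_meminfo_value() {
    # Read one key from /proc/meminfo, or from the per-node copy when a node
    # number is given. Per-node lines carry a "Node <N> " prefix, which the
    # trace also shows being stripped before parsing, so drop it here too.
    local want=$1 node=$2 mem_f=/proc/meminfo var val _
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
        && mem_f=/sys/devices/system/node/node$node/meminfo
    while IFS=': ' read -r var val _; do               # split on colon/space
        [[ $var == "$want" ]] && { echo "$val"; return 0; }
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}

For example, get_meminfo_value HugePages_Rsvd prints 0 on this box, and the same loop is pointed at /sys/devices/system/node/node0/meminfo further down in the trace when per-node counters such as HugePages_Surp are needed.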
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:36.191 anon_hugepages=0 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40971040 kB' 'MemAvailable: 44699556 kB' 'Buffers: 8940 kB' 'Cached: 13218532 kB' 'SwapCached: 0 kB' 'Active: 10046504 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630064 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511108 kB' 'Mapped: 160568 kB' 'Shmem: 9122760 kB' 'KReclaimable: 241340 kB' 'Slab: 934228 kB' 'SReclaimable: 241340 kB' 'SUnreclaim: 692888 kB' 'KernelStack: 21952 kB' 'PageTables: 7972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10832352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.191 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25471532 kB' 'MemUsed: 7113836 kB' 'SwapCached: 0 kB' 'Active: 3925944 kB' 'Inactive: 201432 kB' 'Active(anon): 3734476 kB' 'Inactive(anon): 0 kB' 'Active(file): 191468 kB' 'Inactive(file): 201432 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3764232 kB' 'Mapped: 70408 kB' 'AnonPages: 366308 kB' 'Shmem: 3371332 kB' 'KernelStack: 12632 kB' 'PageTables: 4392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 107972 kB' 'Slab: 402440 kB' 'SReclaimable: 107972 kB' 'SUnreclaim: 294468 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.192 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:04:36.193 node0=1024 expecting 1024 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:04:36.193 00:04:36.193 real 0m5.036s 00:04:36.193 user 0m1.217s 00:04:36.193 sys 0m2.255s 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:36.193 08:16:49 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x 00:04:36.193 ************************************ 00:04:36.193 END TEST single_node_setup 00:04:36.193 ************************************ 00:04:36.193 08:16:49 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc 00:04:36.193 08:16:49 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:36.193 08:16:49 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:36.193 08:16:49 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:36.193 ************************************ 00:04:36.193 START TEST even_2G_alloc 00:04:36.193 ************************************ 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local 
_nr_hugepages=1024 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.193 08:16:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:39.479 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:39.479 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:04:39.479 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40977692 kB' 'MemAvailable: 44706196 kB' 'Buffers: 8940 kB' 'Cached: 13218660 kB' 'SwapCached: 0 kB' 'Active: 10044956 kB' 'Inactive: 3688272 kB' 'Active(anon): 9628516 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508832 kB' 'Mapped: 159660 kB' 'Shmem: 9122888 kB' 'KReclaimable: 241316 kB' 'Slab: 934200 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692884 kB' 'KernelStack: 21904 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10827540 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.479 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 
08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.480 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40979300 kB' 'MemAvailable: 44707804 kB' 'Buffers: 8940 kB' 'Cached: 13218664 kB' 'SwapCached: 0 kB' 'Active: 10045112 kB' 
'Inactive: 3688272 kB' 'Active(anon): 9628672 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509000 kB' 'Mapped: 159556 kB' 'Shmem: 9122892 kB' 'KReclaimable: 241316 kB' 'Slab: 934084 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692768 kB' 'KernelStack: 21952 kB' 'PageTables: 7668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826144 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.481 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40978068 kB' 'MemAvailable: 44706572 kB' 'Buffers: 8940 kB' 'Cached: 13218680 kB' 'SwapCached: 0 kB' 'Active: 10045072 kB' 'Inactive: 3688272 kB' 'Active(anon): 9628632 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508920 kB' 'Mapped: 159556 kB' 'Shmem: 9122908 kB' 'KReclaimable: 241316 kB' 'Slab: 934088 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692772 kB' 'KernelStack: 21888 kB' 'PageTables: 7568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 
'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.482 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.483 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:04:39.484 nr_hugepages=1024 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:39.484 resv_hugepages=0 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:39.484 surplus_hugepages=0 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:39.484 anon_hugepages=0 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40976164 kB' 'MemAvailable: 44704668 kB' 'Buffers: 8940 kB' 'Cached: 13218704 kB' 'SwapCached: 0 kB' 'Active: 10046556 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630116 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510440 kB' 'Mapped: 159556 kB' 'Shmem: 9122932 kB' 'KReclaimable: 241316 kB' 'Slab: 934088 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692772 kB' 'KernelStack: 21904 kB' 'PageTables: 7512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10838368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.484 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
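
[editor's note] The long runs of "[[ <key> == ... ]]" / "continue" entries above are bash xtrace from the get_meminfo helper in setup/common.sh: it takes one snapshot of /proc/meminfo (or a per-node meminfo file), walks it line by line with IFS=': ' and read -r var val _, skips every key until it reaches the one requested, and echoes that key's value. A minimal, self-contained sketch of that pattern, reconstructed from this trace rather than copied from the shipped script (the function body and the "Node <n> " prefix handling are assumptions):

  # get_meminfo <key> [node] - print the value of one meminfo key, as traced above.
  # Illustrative reconstruction; not the actual setup/common.sh.
  get_meminfo() {
      local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      while IFS= read -r line; do
          line=${line#Node "$node" }              # per-node lines carry a "Node <n> " prefix
          IFS=': ' read -r var val _ <<< "$line"  # e.g. var=HugePages_Total, val=1024
          [[ $var == "$get" ]] || continue        # the skipped keys are the entries traced above
          echo "${val:-0}"
          return 0
      done < "$mem_f"
      echo 0                                      # key not present -> 0 (common.sh@33 echoes 0)
  }

  get_meminfo HugePages_Total     # 1024 on this runner
  get_meminfo HugePages_Rsvd      # 0 on this runner

Scanning the whole snapshot once per requested key is why every get_meminfo call produces one xtrace entry per meminfo line, which is what makes this part of the log so long. The trace now resumes below.
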
00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:04:39.485 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:39.486 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26516768 kB' 'MemUsed: 6068600 kB' 'SwapCached: 0 kB' 'Active: 3925064 kB' 'Inactive: 201432 kB' 'Active(anon): 3733596 kB' 'Inactive(anon): 0 kB' 'Active(file): 191468 kB' 'Inactive(file): 201432 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3764248 kB' 'Mapped: 70156 kB' 'AnonPages: 365336 kB' 'Shmem: 3371348 kB' 'KernelStack: 12728 kB' 'PageTables: 4388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 107956 kB' 'Slab: 402260 kB' 'SReclaimable: 107956 kB' 'SUnreclaim: 294304 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 
08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.486 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698416 kB' 'MemFree: 14463352 kB' 'MemUsed: 13235064 kB' 'SwapCached: 0 kB' 'Active: 6119884 kB' 'Inactive: 3486840 kB' 'Active(anon): 5894912 kB' 'Inactive(anon): 0 kB' 'Active(file): 224972 kB' 'Inactive(file): 3486840 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9463436 kB' 'Mapped: 89380 kB' 'AnonPages: 143444 kB' 'Shmem: 5751624 kB' 'KernelStack: 9224 kB' 'PageTables: 3000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 133360 kB' 'Slab: 531796 kB' 'SReclaimable: 133360 kB' 'SUnreclaim: 398436 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 
08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.487 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:04:39.488 node0=512 expecting 512 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:04:39.488 node1=512 expecting 512 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]] 00:04:39.488 00:04:39.488 real 0m3.294s 00:04:39.488 user 0m1.210s 00:04:39.488 sys 0m2.117s 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:39.488 08:16:52 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:39.488 ************************************ 00:04:39.488 END TEST even_2G_alloc 00:04:39.488 ************************************ 00:04:39.488 08:16:52 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc 00:04:39.488 08:16:52 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:39.488 08:16:52 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:39.488 08:16:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:39.488 ************************************ 00:04:39.488 START TEST odd_alloc 00:04:39.488 ************************************ 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:04:39.488 08:16:52 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.488 08:16:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:43.678 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:43.678 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:43.678 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages 00:04:43.678 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node 00:04:43.678 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:04:43.678 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:43.679 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40993628 kB' 'MemAvailable: 44722132 kB' 'Buffers: 8940 kB' 'Cached: 13218844 kB' 'SwapCached: 0 kB' 'Active: 10045408 kB' 'Inactive: 3688272 kB' 'Active(anon): 9628968 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509256 kB' 'Mapped: 159552 kB' 'Shmem: 9123072 kB' 'KReclaimable: 241316 kB' 'Slab: 933064 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 691748 kB' 'KernelStack: 21776 kB' 'PageTables: 7500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10825464 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.679 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40995180 kB' 'MemAvailable: 44723684 kB' 'Buffers: 8940 kB' 'Cached: 13218848 kB' 'SwapCached: 0 kB' 'Active: 10049024 kB' 'Inactive: 3688272 kB' 'Active(anon): 9632584 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512888 kB' 'Mapped: 160324 kB' 'Shmem: 9123076 kB' 'KReclaimable: 241316 kB' 'Slab: 933148 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 691832 kB' 'KernelStack: 21760 kB' 'PageTables: 7488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10828464 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.680 
08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.680 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
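The long run of "[[ <meminfo key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] ... continue" entries around this point is the xtrace of setup/common.sh's get_meminfo helper scanning /proc/meminfo one field at a time until it reaches the requested key. A minimal bash sketch of what the traced helper appears to do (reconstructed from this log; the per-node branch and the extglob prefix strip are approximations, not the exact upstream code):

shopt -s extglob                          # needed for the +([0-9]) pattern below
get_meminfo() {                           # e.g. get_meminfo HugePages_Surp [node]
  local get=$1 node=${2:-} var val mem_f mem
  mem_f=/proc/meminfo
  # when a node id is given and that node exposes its own meminfo, read it instead
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")        # drop the "Node N " prefix used in per-node files
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] || continue      # every non-matching key logs one 'continue' above
    echo "$val"                           # the value; the trailing "kB" lands in _ and is dropped
    return 0
  done < <(printf '%s\n' "${mem[@]}")
  return 1
}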
00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.681 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 
08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- 
# [[ -n '' ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40999944 kB' 'MemAvailable: 44728448 kB' 'Buffers: 8940 kB' 'Cached: 13218868 kB' 'SwapCached: 0 kB' 'Active: 10046996 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630556 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510816 kB' 'Mapped: 160468 kB' 'Shmem: 9123096 kB' 'KReclaimable: 241316 kB' 'Slab: 933148 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 691832 kB' 'KernelStack: 21744 kB' 'PageTables: 7448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10826244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
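The meminfo dumps in this pass feed setup/hugepages.sh's verify_nr_hugepages: it pulls AnonHugePages, HugePages_Surp and HugePages_Rsvd globally, then walks each NUMA node and compares the pages the node actually got against the expected split (for this odd_alloc run the trace sets up 1025 pages as node0=513, node1=512). A rough bash sketch of that check, reusing the variable names visible in the trace; the loop body and the final comparison are simplifications, not the literal upstream code:

verify_nr_hugepages() {
  local node surp resv anon node_surp
  local -a nodes_test nodes_sys sorted_t sorted_s
  anon=$(get_meminfo AnonHugePages)       # 0 in the trace above
  surp=$(get_meminfo HugePages_Surp)      # 0 in the trace above
  resv=$(get_meminfo HugePages_Rsvd)      # this dump also reports 0
  nodes_test=(513 512)                    # expected split of 1025 pages over 2 nodes
  for node in "${!nodes_test[@]}"; do
    nodes_sys[node]=$(get_meminfo HugePages_Total "$node")  # what the node really has
    node_surp=$(get_meminfo HugePages_Surp "$node")
    (( nodes_test[node] += node_surp ))   # the '(( nodes_test[node] += 0 ))' entries above
    sorted_t[nodes_test[node]]=1
    sorted_s[nodes_sys[node]]=1
    echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
  done
  # pass when the per-node counts agree, as in the '[[ 512 == 512 ]]'-style check
  # seen at the end of even_2G_alloc earlier in this log
  [[ ${!sorted_s[*]} == "${!sorted_t[*]}" ]]
}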
00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.682 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.683 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:04:43.684 nr_hugepages=1025 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:43.684 resv_hugepages=0 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:43.684 surplus_hugepages=0 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:43.684 anon_hugepages=0 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41003264 kB' 'MemAvailable: 44731768 kB' 'Buffers: 8940 kB' 'Cached: 13218888 kB' 'SwapCached: 0 kB' 'Active: 10046088 kB' 'Inactive: 3688272 kB' 'Active(anon): 9629648 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509964 kB' 'Mapped: 159884 kB' 'Shmem: 9123116 kB' 'KReclaimable: 241316 kB' 'Slab: 933148 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 691832 kB' 'KernelStack: 21776 kB' 'PageTables: 7532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 
10824512 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.684 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 
08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.685 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:43.686 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26521660 kB' 'MemUsed: 6063708 kB' 'SwapCached: 0 kB' 'Active: 3927256 kB' 'Inactive: 201432 kB' 'Active(anon): 3735788 kB' 'Inactive(anon): 0 kB' 'Active(file): 191468 kB' 'Inactive(file): 201432 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3764316 kB' 'Mapped: 70172 kB' 'AnonPages: 367676 kB' 'Shmem: 3371416 kB' 'KernelStack: 12536 kB' 'PageTables: 4456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 107956 kB' 'Slab: 401480 kB' 'SReclaimable: 107956 kB' 'SUnreclaim: 293524 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.686 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:43.687 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node1/meminfo 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698416 kB' 'MemFree: 14482000 kB' 'MemUsed: 13216416 kB' 'SwapCached: 0 kB' 'Active: 6118704 kB' 'Inactive: 3486840 kB' 'Active(anon): 5893732 kB' 'Inactive(anon): 0 kB' 'Active(file): 224972 kB' 'Inactive(file): 3486840 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9463556 kB' 'Mapped: 89380 kB' 'AnonPages: 142068 kB' 'Shmem: 5751744 kB' 'KernelStack: 9224 kB' 'PageTables: 3004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 133360 kB' 'Slab: 531668 kB' 'SReclaimable: 133360 kB' 'SUnreclaim: 398308 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
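The xtrace above is setup/common.sh's get_meminfo walking node1's meminfo key by key until it reaches the requested HugePages_Surp field and echoes its value. A minimal standalone sketch of that style of lookup, assuming bash with /proc/meminfo and per-node meminfo files available; the helper name get_node_meminfo is hypothetical and not part of common.sh, this is an illustration of the technique only:

    get_node_meminfo() {
        # Look up one meminfo key ($1), optionally for NUMA node $2.
        local get=$1 node=$2 line var val _
        local mem_f=/proc/meminfo
        # Per-node files live under /sys and prefix every line with "Node <n> ".
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        while read -r line; do
            line=${line#"Node $node "}            # strip the per-node prefix if present
            IFS=': ' read -r var val _ <<< "$line" # split "Key:  value kB" into var/val
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < "$mem_f"
        return 1
    }

Called as get_node_meminfo HugePages_Surp 1, such a helper would print 0, matching the "echo 0" / "return 0" pair the trace reaches once the key finally matches.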
00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.688 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:04:43.689 node0=513 expecting 513 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:04:43.689 node1=512 expecting 512 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:43.689 00:04:43.689 real 0m3.717s 00:04:43.689 user 0m1.450s 00:04:43.689 sys 0m2.330s 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:43.689 08:16:56 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:43.689 ************************************ 00:04:43.689 END TEST odd_alloc 00:04:43.689 ************************************ 00:04:43.689 08:16:56 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc 00:04:43.689 08:16:56 setup.sh.hugepages -- common/autotest_common.sh@1101 -- 
# '[' 2 -le 1 ']' 00:04:43.689 08:16:56 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:43.689 08:16:56 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:43.689 ************************************ 00:04:43.689 START TEST custom_alloc 00:04:43.689 ************************************ 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=, 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=() 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:43.689 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 )) 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 )) 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.690 08:16:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:46.982 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.982 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.982 
08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 39962864 kB' 'MemAvailable: 43691368 kB' 'Buffers: 8940 kB' 'Cached: 13219020 kB' 'SwapCached: 0 kB' 'Active: 10053056 kB' 'Inactive: 3688272 kB' 'Active(anon): 9636616 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516640 kB' 'Mapped: 160012 kB' 'Shmem: 9123248 kB' 'KReclaimable: 241316 kB' 'Slab: 934420 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 693104 kB' 'KernelStack: 21824 kB' 'PageTables: 7688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10831260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214308 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.982 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.983 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 39963440 kB' 'MemAvailable: 43691944 kB' 'Buffers: 8940 kB' 'Cached: 13219024 kB' 'SwapCached: 0 kB' 'Active: 10047104 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630664 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510620 kB' 'Mapped: 159508 kB' 'Shmem: 9123252 kB' 'KReclaimable: 241316 kB' 'Slab: 934400 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 693084 kB' 'KernelStack: 21792 kB' 'PageTables: 7560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10825160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.984 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.985 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.985 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 39963148 kB' 'MemAvailable: 43691652 kB' 'Buffers: 8940 kB' 'Cached: 13219024 kB' 'SwapCached: 0 kB' 'Active: 10047500 kB' 'Inactive: 3688272 kB' 'Active(anon): 9631060 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511064 kB' 'Mapped: 159508 kB' 'Shmem: 9123252 kB' 'KReclaimable: 241316 kB' 'Slab: 934512 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 693196 kB' 'KernelStack: 21792 kB' 'PageTables: 7584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10825180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.986 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 
08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:04:46.987 nr_hugepages=1536 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:46.987 resv_hugepages=0 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:46.987 surplus_hugepages=0 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:46.987 anon_hugepages=0 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages )) 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.987 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.988 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 39962400 kB' 'MemAvailable: 43690904 kB' 'Buffers: 8940 kB' 'Cached: 13219060 kB' 'SwapCached: 0 kB' 'Active: 10047460 kB' 'Inactive: 3688272 kB' 'Active(anon): 9631020 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510984 kB' 'Mapped: 159508 kB' 'Shmem: 9123288 kB' 'KReclaimable: 241316 kB' 'Slab: 934512 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 693196 kB' 'KernelStack: 21808 kB' 'PageTables: 7632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10826332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
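[editor's note] Just above, the test recorded surp=0 (HugePages_Surp), resv=0 (HugePages_Rsvd) and nr_hugepages=1536, then checked that the allocated total matches the request plus surplus and reserved pages before re-reading HugePages_Total. The Hugetlb figure in the snapshots is consistent with that: 1536 pages x 2048 kB page size = 3145728 kB (3 GiB). A hedged sketch of roughly that bookkeeping; the variable names below are placeholders, not the hugepages.sh ones:

    requested=1536    # pages asked for by this custom_alloc run

    # Pull the three counters the trace just compared (system-wide view).
    total=$(awk '$1=="HugePages_Total:"{print $2}' /proc/meminfo)
    surp=$(awk  '$1=="HugePages_Surp:"{print $2}'  /proc/meminfo)
    resv=$(awk  '$1=="HugePages_Rsvd:"{print $2}'  /proc/meminfo)

    # The pool is only treated as settled when the allocated total matches the
    # request and nothing is left surplus or reserved (all zero in this run).
    if (( total == requested + surp + resv )) && (( total == requested )); then
        echo "hugepage pool consistent: $total pages"
    else
        echo "hugepage pool mismatch: total=$total surp=$surp resv=$resv" >&2
    fi
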
00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.988 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
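[editor's note] Each get_meminfo call in this trace also shows the capture step: the meminfo contents are read into a bash array (mapfile -t mem), any leading "Node <id> " prefix is stripped with an extglob substitution so per-node files parse the same as /proc/meminfo, and the entries are emitted one per line with printf '%s\n' for the key scan. A rough, self-contained illustration of that capture (function and array names are placeholders, and the exact plumbing in common.sh may differ):

    shopt -s extglob   # enables the +([0-9]) pattern used below

    snapshot_meminfo() {
        local file=$1 mem
        # Read every "Key: value" line into one array entry each.
        mapfile -t mem < "$file"
        # Per-node files prefix each line with "Node <id> "; drop that prefix so
        # the same parser handles /proc/meminfo and node-local meminfo alike.
        mem=("${mem[@]#Node +([0-9]) }")
        printf '%s\n' "${mem[@]}"
    }

    # Works for the system-wide file or a per-node one such as
    # /sys/devices/system/node/node0/meminfo.
    snapshot_meminfo /proc/meminfo | head -3
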
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
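[editor's note] After the system-wide HugePages_Total check, the trace switches to per-node accounting: get_nodes records the expected split (512 pages on node 0, 1024 on node 1, 1536 in total for this custom_alloc run), and get_meminfo is invoked again with node=0 so it reads /sys/devices/system/node/node0/meminfo rather than /proc/meminfo. A hypothetical sketch of that file selection and per-node check (paths as shown in the trace; the helper itself is illustrative):

    node_meminfo_file() {
        # With a node id, prefer that node's meminfo; otherwise fall back to the
        # system-wide /proc/meminfo (what the trace does when node= is empty).
        local node=$1
        local f=/sys/devices/system/node/node${node}/meminfo
        if [[ -n $node && -e $f ]]; then
            echo "$f"
        else
            echo /proc/meminfo
        fi
    }

    expected=(512 1024)   # node 0 and node 1, 1536 pages total in this run
    for node in "${!expected[@]}"; do
        file=$(node_meminfo_file "$node")
        # Strip the "Node <id> " prefix so per-node and system-wide files
        # can be parsed with the same field positions.
        total=$(awk '{ sub(/^Node [0-9]+ /, "") }
                     $1 == "HugePages_Total:" { print $2 }' "$file")
        echo "node $node: HugePages_Total=$total (expected ${expected[$node]})"
    done
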
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.990 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:46.989 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:46.990 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26517736 kB' 'MemUsed: 6067632 kB' 'SwapCached: 0 kB' 'Active: 3926372 kB' 'Inactive: 201432 kB' 'Active(anon): 3734904 kB' 'Inactive(anon): 0 kB' 'Active(file): 191468 kB' 'Inactive(file): 201432 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3764348 kB' 'Mapped: 70180 kB' 'AnonPages: 366596 kB' 'Shmem: 3371448 kB' 'KernelStack: 12520 kB' 'PageTables: 4488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 107956 kB' 'Slab: 402596 kB' 'SReclaimable: 107956 kB' 'SUnreclaim: 294640 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:46.990 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # ... (node0 fields MemTotal through HugePages_Free are scanned; none matches HugePages_Surp, each hits "continue")
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
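Node 0 now accounts for its 512 pages with no surplus; the same query runs next for node 1, and the per-node totals are then compared against the requested 512/1024 split. A compact, self-contained sketch of that per-node accounting (illustrative only; the expected values mirror this run):

    #!/usr/bin/env bash
    # Sketch: confirm each NUMA node holds the requested number of 2 MiB pages
    # and report any surplus, using the per-node meminfo files directly.
    declare -A expected=([0]=512 [1]=1024)
    for node in "${!expected[@]}"; do
        total=$(awk '$3 == "HugePages_Total:" {print $4}' "/sys/devices/system/node/node${node}/meminfo")
        surp=$(awk '$3 == "HugePages_Surp:"  {print $4}' "/sys/devices/system/node/node${node}/meminfo")
        echo "node${node}=${total} expecting ${expected[$node]} (surplus ${surp})"
    done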
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:46.991 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698416 kB' 'MemFree: 13445060 kB' 'MemUsed: 14253356 kB' 'SwapCached: 0 kB' 'Active: 6121592 kB' 'Inactive: 3486840 kB' 'Active(anon): 5896620 kB' 'Inactive(anon): 0 kB' 'Active(file): 224972 kB' 'Inactive(file): 3486840 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9463672 kB' 'Mapped: 89348 kB' 'AnonPages: 144912 kB' 'Shmem: 5751860 kB' 'KernelStack: 9368 kB' 'PageTables: 3024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 133360 kB' 'Slab: 531916 kB' 'SReclaimable: 133360 kB' 'SUnreclaim: 398556 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:46.992 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # ... (node1 fields MemTotal through HugePages_Free are scanned; none matches HugePages_Surp, each hits "continue")
00:04:46.992 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:46.992 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:46.992 08:16:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:46.992 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:46.992 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:46.993
00:04:46.993 real 0m3.404s
00:04:46.993 user 0m1.289s
00:04:46.993 sys 0m2.146s
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:46.993 08:16:59 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:46.993 ************************************
00:04:46.993 END TEST custom_alloc
00:04:46.993 ************************************
00:04:46.993 08:16:59 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:46.993 08:16:59 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:46.993 08:16:59 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:46.993 08:16:59 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
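The next test, no_shrink_alloc, asks scripts/setup.sh for 1024 pages pinned to a single node through the NRHUGE and HUGENODE variables visible in its trace below. A rough sketch of that kind of single-node reservation via the kernel's per-node sysfs knob -- the mechanism is assumed here for illustration, it is not a quote of setup.sh:

    #!/usr/bin/env bash
    # Sketch: reserve NRHUGE 2 MiB hugepages on NUMA node HUGENODE, then show
    # what the node actually holds. The sysfs write requires root.
    NRHUGE=${NRHUGE:-1024}
    HUGENODE=${HUGENODE:-0}
    sysfs=/sys/devices/system/node/node${HUGENODE}/hugepages/hugepages-2048kB/nr_hugepages
    echo "$NRHUGE" | sudo tee "$sysfs" > /dev/null
    grep HugePages_Total "/sys/devices/system/node/node${HUGENODE}/meminfo"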
00:04:46.993 ************************************
00:04:46.993 START TEST no_shrink_alloc
00:04:46.993 ************************************
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:46.993 08:16:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:50.280 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:50.280 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:04:50.280 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
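The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test above checks the transparent-hugepage policy string (here "always [madvise] never", i.e. madvise mode) before deciding whether AnonHugePages needs to be sampled. A standalone sketch of the same decision, reading the standard kernel policy file directly (path assumed present on the test host):

    #!/usr/bin/env bash
    # Sketch: sample AnonHugePages only when transparent hugepages are not
    # globally disabled; the policy file reads like "always [madvise] never".
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *'[never]'* ]]; then
        awk '/^AnonHugePages:/ {print $1, $2, $3}' /proc/meminfo
    fi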
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.281 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41031760 kB' 'MemAvailable: 44760264 kB' 'Buffers: 8940 kB' 'Cached: 13219188 kB' 'SwapCached: 0 kB' 'Active: 10047592 kB' 'Inactive: 3688272 kB' 'Active(anon): 9631152 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510976 kB' 'Mapped: 159644 kB' 'Shmem: 9123416 kB' 'KReclaimable: 241316 kB' 'Slab: 933688 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692372 kB' 'KernelStack: 21920 kB' 'PageTables: 7740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10828448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214496 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB'
00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # ... (fields MemTotal through HardwareCorrupted are scanned; none matches AnonHugePages, each hits "continue")
00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
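The call that follows repeats the same field scan against /proc/meminfo just to fetch HugePages_Surp. When no per-node file is involved, an equivalent one-liner (awk shown here as an alternative to the script's read loop) is:

    # Sketch: system-wide surplus hugepages straight from /proc/meminfo.
    awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo   # prints 0 in this run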
get=HugePages_Surp 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41032164 kB' 'MemAvailable: 44760668 kB' 'Buffers: 8940 kB' 'Cached: 13219188 kB' 'SwapCached: 0 kB' 'Active: 10046852 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630412 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510260 kB' 'Mapped: 159592 kB' 'Shmem: 9123416 kB' 'KReclaimable: 241316 kB' 'Slab: 933688 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692372 kB' 'KernelStack: 21792 kB' 'PageTables: 7288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10828464 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.282 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.283 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 
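The entries above are bash xtrace output from the get_meminfo helper in setup/common.sh: it snapshots /proc/meminfo (the printf '%s\n' line), then walks the fields with IFS=': ' read -r var val _ until the requested key matches and its value is echoed. Below is a minimal sketch of that lookup pattern, assuming a helper of the same shape; it is an approximation for readability, not the verbatim SPDK setup/common.sh, and the per-node branch is inferred from the node<N>/meminfo test at common.sh@23.

#!/usr/bin/env bash
# Sketch only (assumed structure, not the exact SPDK source): resolve one
# /proc/meminfo field the way the trace above steps through it.
shopt -s extglob

get_meminfo() {
	local get=$1 node=${2:-}
	local var val _
	local mem_f=/proc/meminfo mem
	# For a per-node query, read that NUMA node's own meminfo file instead
	# (assumption: node is a NUMA node number such as 0).
	[[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
		&& mem_f=/sys/devices/system/node/node$node/meminfo
	mapfile -t mem < "$mem_f"
	# Per-node meminfo lines carry a "Node <n> " prefix; strip it.
	mem=("${mem[@]#Node +([0-9]) }")
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] || continue   # skip every field except the requested one
		echo "$val"                        # e.g. 0 for HugePages_Surp in the run above
		return 0
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

# Example: get_meminfo HugePages_Surp   -> 0 on this node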
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:50.284 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41031044 kB' 'MemAvailable: 44759548 kB' 'Buffers: 8940 kB' 'Cached: 13219188 kB' 'SwapCached: 0 kB' 'Active: 10046892 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630452 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510300 kB' 'Mapped: 159592 kB' 'Shmem: 9123416 kB' 'KReclaimable: 241316 kB' 'Slab: 933700 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692384 kB' 'KernelStack: 21872 kB' 'PageTables: 7372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826988 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: setup/common.sh@31-32 scans every field of this snapshot from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\R\s\v\d, skipping each with continue]
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:04:50.286 nr_hugepages=1024
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:04:50.286 resv_hugepages=0
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:04:50.286 surplus_hugepages=0
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:04:50.286 anon_hugepages=0
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
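The hugepages.sh entries just above echo the counters gathered through get_meminfo (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0) and assert two arithmetic identities before re-reading HugePages_Total. A sketch of that accounting check follows, using the values echoed in this run; only the two (( ... )) tests come from the trace, while the wrapper name and the target variable are illustrative assumptions.

#!/usr/bin/env bash
# Sketch of the no_shrink_alloc accounting asserted at hugepages.sh@106-108
# (assumed wrapper; values taken from the echoes in the trace above).
check_no_shrink_alloc() {
	local target=1024           # requested huge page count (expanded as the literal 1024 above)
	local nr_hugepages=1024     # HugePages_Total reported by the kernel
	local surp=0 resv=0         # HugePages_Surp and HugePages_Rsvd from get_meminfo
	# The pool must account for exactly the requested pages plus surplus/reserved...
	(( target == nr_hugepages + surp + resv )) || return 1
	# ...and with no surplus or reserved pages it must match the request exactly.
	(( target == nr_hugepages ))
}

check_no_shrink_alloc && echo "hugepage pool intact: allocation did not shrink it"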
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:50.286 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41029308 kB' 'MemAvailable: 44757812 kB' 'Buffers: 8940 kB' 'Cached: 13219232 kB' 'SwapCached: 0 kB' 'Active: 10046584 kB' 'Inactive: 3688272 kB' 'Active(anon): 9630144 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510024 kB' 'Mapped: 159572 kB' 'Shmem: 9123460 kB' 'KReclaimable: 241316 kB' 'Slab: 933828 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692512 kB' 'KernelStack: 21760 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10825880 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: setup/common.sh@31-32 scans this snapshot field by field against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l, skipping MemTotal through KernelStack with continue; the scan resumes below]
00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables ==
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.287 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.288 08:17:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25456672 kB' 'MemUsed: 7128696 kB' 'SwapCached: 0 kB' 'Active: 3926884 kB' 'Inactive: 201432 kB' 'Active(anon): 3735416 kB' 'Inactive(anon): 0 kB' 'Active(file): 191468 kB' 'Inactive(file): 201432 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3764372 kB' 'Mapped: 70192 kB' 'AnonPages: 367200 kB' 'Shmem: 3371472 kB' 'KernelStack: 12504 kB' 'PageTables: 4416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 107956 kB' 'Slab: 401424 kB' 'SReclaimable: 107956 kB' 'SUnreclaim: 293468 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.288 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.289 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:04:50.290 node0=1024 expecting 1024 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.290 08:17:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:53.582 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:53.582 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:53.582 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40990972 kB' 'MemAvailable: 44719476 kB' 'Buffers: 8940 kB' 'Cached: 13219340 kB' 'SwapCached: 0 kB' 'Active: 10048644 kB' 'Inactive: 3688272 kB' 'Active(anon): 9632204 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510988 kB' 'Mapped: 160088 kB' 'Shmem: 9123568 kB' 'KReclaimable: 241316 kB' 'Slab: 934016 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692700 kB' 'KernelStack: 21712 kB' 'PageTables: 7376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826488 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.582 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.583 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40991856 kB' 'MemAvailable: 44720360 kB' 'Buffers: 8940 kB' 'Cached: 13219344 kB' 'SwapCached: 0 kB' 'Active: 10047948 kB' 'Inactive: 3688272 kB' 'Active(anon): 9631508 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511344 kB' 'Mapped: 159576 kB' 'Shmem: 9123572 kB' 'KReclaimable: 241316 kB' 'Slab: 933852 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692536 kB' 'KernelStack: 21760 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826504 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.584 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 
08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.585 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
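The xtrace records above come from get_meminfo() in test/setup/common.sh: it walks the meminfo contents one line at a time with IFS=': ' and read -r var val _, compares each key against the requested field, and echoes the value once the key matches (here the scan is looking for HugePages_Surp). A minimal sketch of that loop, reconstructed from the trace and simplified to read /proc/meminfo directly; the traced helper actually buffers the file into the mem array first so it can also serve per-node meminfo:

    # Sketch only, reconstructed from the xtrace; not the verbatim SPDK helper.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip MemTotal, MemFree, ... until the key matches
            echo "$val"                        # value only; the 'kB' unit lands in $_
            return 0
        done < /proc/meminfo
    }

Called as get_meminfo HugePages_Surp, this yields 0 on the host traced here, which is the value hugepages.sh stores in surp below.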
00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.586 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40991100 kB' 'MemAvailable: 44719604 kB' 'Buffers: 8940 kB' 'Cached: 13219344 kB' 'SwapCached: 0 kB' 'Active: 10047948 kB' 'Inactive: 3688272 kB' 'Active(anon): 9631508 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511344 kB' 'Mapped: 159576 kB' 'Shmem: 9123572 kB' 'KReclaimable: 241316 kB' 'Slab: 933852 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692536 kB' 'KernelStack: 21760 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 
08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.587 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
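Every get_meminfo call in this trace runs with node= left empty, so the [[ -e /sys/devices/system/node/node/meminfo ]] test fails and the helper falls back to the system-wide /proc/meminfo. When a node number is supplied, the per-node file is read instead, and the 'Node <n> ' prefix those lines carry is stripped, which is what the mem=("${mem[@]#Node +([0-9]) }") expansion visible in the trace does. A rough sketch of that file selection, assuming extglob is enabled in the surrounding shell (the +([0-9]) pattern requires it):

    # Sketch of the node handling seen in the trace; details may differ in common.sh.
    local node=$2 mem_f=/proc/meminfo
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node lines look like 'Node 0 MemTotal: ... kB'; drop the prefix.
    mem=("${mem[@]#Node +([0-9]) }")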
00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.588 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:04:53.589 nr_hugepages=1024 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:53.589 resv_hugepages=0 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:53.589 surplus_hugepages=0 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:53.589 anon_hugepages=0 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40990344 kB' 'MemAvailable: 44718848 kB' 'Buffers: 8940 kB' 'Cached: 13219384 kB' 'SwapCached: 0 kB' 'Active: 10047996 kB' 'Inactive: 3688272 kB' 'Active(anon): 9631556 kB' 'Inactive(anon): 0 kB' 'Active(file): 416440 kB' 'Inactive(file): 3688272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511348 kB' 'Mapped: 159576 kB' 'Shmem: 9123612 kB' 'KReclaimable: 241316 kB' 'Slab: 933852 kB' 'SReclaimable: 241316 kB' 'SUnreclaim: 692536 kB' 'KernelStack: 21760 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10826548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 75264 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 
'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.589 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.590 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
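The long run of '[[ <field> == HugePages_Total ]]' / 'continue' entries around this point is xtrace output from the get_meminfo helper in test/setup/common.sh: it caches /proc/meminfo (or a per-node meminfo file) via printf/mapfile and then scans it field by field, skipping everything until the requested key matches. A hedged sketch of that scan pattern, with the helper body reconstructed rather than copied from SPDK:

    # Sketch only: the real helper works over a mapfile'd copy of meminfo and
    # handles per-node files; this reproduces just the scan being traced here.
    get_meminfo_sketch() {
        local get=$1 mem_f=${2:-/proc/meminfo} var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # each miss is one "continue" entry in the trace
            echo "$val"                        # value column, without the key or the kB unit
            return 0
        done < "$mem_f"
        return 1
    }

On this node the traced call, get_meminfo HugePages_Total, ends by echoing 1024, which the suite then compares against nr_hugepages plus surplus and reserved pages.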
00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25450152 kB' 'MemUsed: 7135216 kB' 'SwapCached: 0 kB' 'Active: 3927012 kB' 'Inactive: 201432 kB' 'Active(anon): 3735544 kB' 'Inactive(anon): 0 kB' 'Active(file): 191468 kB' 'Inactive(file): 201432 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3764404 kB' 'Mapped: 70196 kB' 'AnonPages: 367292 kB' 'Shmem: 3371504 kB' 'KernelStack: 12488 kB' 'PageTables: 4372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 107956 kB' 'Slab: 401660 kB' 'SReclaimable: 107956 kB' 'SUnreclaim: 293704 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.591 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.591 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.592 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:04:53.593 node0=1024 expecting 1024 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:04:53.593 00:04:53.593 real 0m6.800s 00:04:53.593 user 0m2.556s 00:04:53.593 sys 0m4.331s 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:53.593 08:17:06 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:53.593 ************************************ 00:04:53.593 END TEST no_shrink_alloc 00:04:53.593 ************************************ 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.593 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:53.852 08:17:06 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:53.852 00:04:53.852 real 0m22.926s 00:04:53.852 user 0m8.020s 00:04:53.852 sys 0m13.600s 
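By this point the no_shrink_alloc test has confirmed the pool globally (1024 == nr_hugepages + surp + resv), walked both NUMA nodes reported by get_nodes (no_nodes=2), read node0's meminfo to pick up HugePages_Surp, and printed 'node0=1024 expecting 1024'. A hedged sketch of an equivalent per-node check using the standard sysfs layout (the function name and the 2048 kB page-size path are assumptions, not taken from the SPDK helpers):

    # Sketch: compare each node's allocated 2 MiB hugepages against an expected count.
    check_node_hugepages() {
        local expected=$1 path node count
        for path in /sys/devices/system/node/node[0-9]*; do
            node=${path##*node}
            count=$(< "$path/hugepages/hugepages-2048kB/nr_hugepages")
            echo "node${node}=${count} expecting ${expected}"
        done
    }

The traced suite reads the per-node meminfo files rather than nr_hugepages, but the bookkeeping it prints is the same.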
00:04:53.852 08:17:06 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:53.852 08:17:06 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:53.852 ************************************ 00:04:53.852 END TEST hugepages 00:04:53.852 ************************************ 00:04:53.852 08:17:06 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:53.853 08:17:06 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:53.853 08:17:06 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:53.853 08:17:06 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:53.853 ************************************ 00:04:53.853 START TEST driver 00:04:53.853 ************************************ 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:53.853 * Looking for test storage... 00:04:53.853 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1681 -- # lcov --version 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:53.853 08:17:06 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:53.853 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.853 --rc genhtml_branch_coverage=1 00:04:53.853 --rc genhtml_function_coverage=1 00:04:53.853 --rc genhtml_legend=1 00:04:53.853 --rc geninfo_all_blocks=1 00:04:53.853 --rc geninfo_unexecuted_blocks=1 00:04:53.853 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.853 ' 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:53.853 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.853 --rc genhtml_branch_coverage=1 00:04:53.853 --rc genhtml_function_coverage=1 00:04:53.853 --rc genhtml_legend=1 00:04:53.853 --rc geninfo_all_blocks=1 00:04:53.853 --rc geninfo_unexecuted_blocks=1 00:04:53.853 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.853 ' 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:53.853 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.853 --rc genhtml_branch_coverage=1 00:04:53.853 --rc genhtml_function_coverage=1 00:04:53.853 --rc genhtml_legend=1 00:04:53.853 --rc geninfo_all_blocks=1 00:04:53.853 --rc geninfo_unexecuted_blocks=1 00:04:53.853 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.853 ' 00:04:53.853 08:17:06 setup.sh.driver -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:53.853 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:53.853 --rc genhtml_branch_coverage=1 00:04:53.853 --rc genhtml_function_coverage=1 00:04:53.853 --rc genhtml_legend=1 00:04:53.853 --rc geninfo_all_blocks=1 00:04:53.853 --rc geninfo_unexecuted_blocks=1 00:04:53.853 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:53.853 ' 00:04:54.112 08:17:06 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:54.112 08:17:06 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:54.112 08:17:06 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:59.383 08:17:11 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:59.383 08:17:11 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:59.383 08:17:11 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:59.383 08:17:11 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:59.383 ************************************ 00:04:59.383 START TEST guess_driver 00:04:59.383 ************************************ 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:59.383 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:59.383 Looking for driver=vfio-pci 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.383 08:17:11 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:01.915 08:17:15 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:01.915 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.174 08:17:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:03.551 08:17:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:03.551 08:17:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:03.551 08:17:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:03.809 08:17:16 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:03.809 08:17:16 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:03.809 08:17:16 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:03.809 08:17:16 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:09.078 00:05:09.078 real 0m9.935s 00:05:09.078 user 0m2.673s 00:05:09.078 sys 0m5.024s 00:05:09.078 08:17:21 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:09.078 08:17:21 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:09.078 ************************************ 00:05:09.078 END TEST guess_driver 00:05:09.078 ************************************ 00:05:09.078 00:05:09.078 real 0m14.792s 00:05:09.078 user 0m4.098s 00:05:09.078 sys 0m7.638s 00:05:09.078 08:17:21 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:09.078 08:17:21 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:09.078 ************************************ 00:05:09.078 END TEST driver 00:05:09.078 ************************************ 00:05:09.078 08:17:21 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:09.078 08:17:21 setup.sh -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:09.078 08:17:21 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:09.078 08:17:21 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:09.078 ************************************ 00:05:09.078 START TEST devices 00:05:09.078 ************************************ 00:05:09.078 08:17:21 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:09.078 * Looking for test storage... 00:05:09.078 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:09.078 08:17:21 setup.sh.devices -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:09.078 08:17:21 setup.sh.devices -- common/autotest_common.sh@1681 -- # lcov --version 00:05:09.078 08:17:21 setup.sh.devices -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:09.078 08:17:21 setup.sh.devices -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:09.078 08:17:21 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:09.079 08:17:21 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:09.079 08:17:21 setup.sh.devices -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:09.079 08:17:21 setup.sh.devices -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:09.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.079 --rc genhtml_branch_coverage=1 00:05:09.079 --rc genhtml_function_coverage=1 00:05:09.079 --rc genhtml_legend=1 00:05:09.079 --rc geninfo_all_blocks=1 00:05:09.079 --rc geninfo_unexecuted_blocks=1 00:05:09.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.079 ' 00:05:09.079 08:17:21 setup.sh.devices -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:09.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.079 --rc genhtml_branch_coverage=1 00:05:09.079 --rc genhtml_function_coverage=1 00:05:09.079 --rc genhtml_legend=1 00:05:09.079 --rc geninfo_all_blocks=1 00:05:09.079 --rc geninfo_unexecuted_blocks=1 00:05:09.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.079 ' 00:05:09.079 08:17:21 setup.sh.devices -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:09.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.079 --rc genhtml_branch_coverage=1 00:05:09.079 --rc genhtml_function_coverage=1 00:05:09.079 --rc genhtml_legend=1 00:05:09.079 --rc geninfo_all_blocks=1 00:05:09.079 --rc geninfo_unexecuted_blocks=1 00:05:09.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.079 ' 00:05:09.079 08:17:21 setup.sh.devices -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:09.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.079 --rc genhtml_branch_coverage=1 00:05:09.079 --rc genhtml_function_coverage=1 00:05:09.079 --rc genhtml_legend=1 00:05:09.079 --rc geninfo_all_blocks=1 00:05:09.079 --rc geninfo_unexecuted_blocks=1 00:05:09.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.079 ' 00:05:09.079 08:17:21 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:09.079 08:17:21 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:09.079 08:17:21 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:09.079 08:17:21 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1656 -- # local nvme bdf 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:12.366 08:17:25 setup.sh.devices -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:12.366 08:17:25 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:12.366 08:17:25 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:12.366 No valid GPT data, bailing 00:05:12.366 08:17:25 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:12.366 08:17:25 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:12.366 08:17:25 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:12.366 08:17:25 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:12.367 08:17:25 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:12.367 08:17:25 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:12.367 08:17:25 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:12.367 08:17:25 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:12.367 08:17:25 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:12.367 08:17:25 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:12.367 08:17:25 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:12.367 08:17:25 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:12.367 08:17:25 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:12.367 08:17:25 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:12.367 08:17:25 
setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:12.367 08:17:25 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:12.367 ************************************ 00:05:12.367 START TEST nvme_mount 00:05:12.367 ************************************ 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:12.367 08:17:25 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:13.304 Creating new GPT entries in memory. 00:05:13.304 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:13.304 other utilities. 00:05:13.304 08:17:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:13.304 08:17:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:13.304 08:17:26 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:13.304 08:17:26 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:13.304 08:17:26 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:14.242 Creating new GPT entries in memory. 00:05:14.242 The operation has completed successfully. 
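The nvme_mount test has just wiped the GPT on the test disk and created one partition covering sectors 2048-2099199 (1 GiB at 512-byte sectors), waiting on the partition uevent in between; the next traced steps format it with mkfs.ext4 -qF and mount it under the test directory. A hedged reconstruction of that sequence; the device name matches this run, but the mount point is simplified and the uevent synchronization is omitted:

    # Sketch of the partition/format/mount steps traced around this point.
    disk=/dev/nvme0n1
    mnt=/mnt/nvme_mount                     # the real test mounts under test/setup/nvme_mount
    sgdisk "$disk" --zap-all                # destroy any existing partition tables
    sgdisk "$disk" --new=1:2048:2099199     # one 1 GiB partition starting at sector 2048
    mkfs.ext4 -qF "${disk}p1"
    mkdir -p "$mnt"
    mount "${disk}p1" "$mnt"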
00:05:14.242 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:14.242 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:14.242 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 959214 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.501 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.502 08:17:27 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:17.788 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:17.788 08:17:30 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:18.046 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:18.046 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:18.046 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:18.046 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:18.046 08:17:31 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.331 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:21.332 08:17:34 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:24.737 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:24.737 00:05:24.737 real 0m12.383s 00:05:24.737 user 0m3.670s 00:05:24.737 sys 0m6.629s 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:24.737 08:17:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:24.737 ************************************ 00:05:24.737 END TEST nvme_mount 00:05:24.737 ************************************ 00:05:24.737 08:17:37 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:24.737 08:17:37 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:24.737 08:17:37 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:24.737 08:17:37 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:24.737 ************************************ 00:05:24.737 START TEST dm_mount 00:05:24.737 ************************************ 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:24.737 08:17:37 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:25.671 Creating new GPT entries in memory. 00:05:25.671 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:25.671 other utilities. 00:05:25.671 08:17:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:25.671 08:17:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:25.671 08:17:38 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:25.671 08:17:38 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:25.671 08:17:38 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:27.047 Creating new GPT entries in memory. 00:05:27.047 The operation has completed successfully. 00:05:27.047 08:17:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:27.047 08:17:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:27.047 08:17:39 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:27.047 08:17:39 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:27.047 08:17:39 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:27.983 The operation has completed successfully. 
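At this point the dm_mount test has two freshly created 1 GiB partitions (sectors 2048..2099199 and 2099200..4196351); the next traced steps stitch them into a single device-mapper device named nvme_dm_test, format it, and mount it. The sketch below shows one way to build such a device with a linear table; the table itself is an assumption, since the trace only shows "dmsetup create nvme_dm_test" and the resulting dm-0 holding nvme0n1p1 and nvme0n1p2.
# Sketch: join the two partitions into one linear dm device (assumed table, not the test's exact command).
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
sz1=$(blockdev --getsz "$p1")        # partition sizes in 512-byte sectors
sz2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $sz1 linear $p1 0
$sz1 $sz2 linear $p2 0
EOF
mkfs.ext4 -qF /dev/mapper/nvme_dm_test   # then format and mount it, as the trace does next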
00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 963648 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:27.983 08:17:40 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:31.270 08:17:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:31.270 08:17:44 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:31.270 08:17:44 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:33.873 08:17:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:34.132 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:34.132 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:34.133 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:34.133 08:17:47 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:34.133 00:05:34.133 real 0m9.351s 00:05:34.133 user 0m2.154s 00:05:34.133 sys 0m4.169s 00:05:34.133 08:17:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:34.133 08:17:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:34.133 ************************************ 00:05:34.133 END TEST dm_mount 00:05:34.133 ************************************ 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:34.133 08:17:47 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:34.391 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:34.391 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:34.391 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:34.391 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:34.391 08:17:47 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:05:34.391 08:17:47 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:34.391 08:17:47 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:34.391 08:17:47 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:34.391 08:17:47 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:34.391 08:17:47 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:34.391 08:17:47 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:34.391 00:05:34.391 real 0m25.782s 00:05:34.391 user 0m7.134s 00:05:34.391 sys 0m13.416s 00:05:34.391 08:17:47 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:34.391 08:17:47 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:34.391 ************************************ 00:05:34.391 END TEST devices 00:05:34.391 ************************************ 00:05:34.391 00:05:34.391 real 1m27.838s 00:05:34.391 user 0m26.928s 00:05:34.391 sys 0m49.509s 00:05:34.391 08:17:47 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:34.391 08:17:47 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:34.391 ************************************ 00:05:34.391 END TEST setup.sh 00:05:34.391 ************************************ 00:05:34.391 08:17:47 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:37.673 Hugepages 00:05:37.673 node hugesize free / total 00:05:37.673 node0 1048576kB 0 / 0 00:05:37.673 node0 2048kB 1024 / 1024 00:05:37.673 node1 1048576kB 0 / 0 00:05:37.673 node1 2048kB 1024 / 1024 00:05:37.673 00:05:37.673 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:37.673 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:37.673 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:37.673 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:37.673 08:17:50 -- spdk/autotest.sh@117 -- # uname -s 00:05:37.673 08:17:50 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:05:37.673 08:17:50 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:05:37.673 08:17:50 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:40.959 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:05:40.959 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:40.959 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:42.865 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:42.865 08:17:55 -- common/autotest_common.sh@1515 -- # sleep 1 00:05:43.802 08:17:56 -- common/autotest_common.sh@1516 -- # bdfs=() 00:05:43.802 08:17:56 -- common/autotest_common.sh@1516 -- # local bdfs 00:05:43.802 08:17:56 -- common/autotest_common.sh@1518 -- # bdfs=($(get_nvme_bdfs)) 00:05:43.802 08:17:56 -- common/autotest_common.sh@1518 -- # get_nvme_bdfs 00:05:43.802 08:17:56 -- common/autotest_common.sh@1496 -- # bdfs=() 00:05:43.802 08:17:56 -- common/autotest_common.sh@1496 -- # local bdfs 00:05:43.802 08:17:56 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:43.802 08:17:56 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:43.802 08:17:56 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:05:43.802 08:17:56 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:05:43.802 08:17:56 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:05:43.802 08:17:56 -- common/autotest_common.sh@1520 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:46.352 Waiting for block devices as requested 00:05:46.352 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:46.610 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:46.610 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:46.610 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:46.610 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:46.869 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:46.869 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:46.869 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:46.869 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:47.128 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:47.128 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:47.128 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:47.387 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:47.387 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:47.387 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:47.645 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:47.645 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:47.904 08:18:00 -- common/autotest_common.sh@1522 -- # for bdf in "${bdfs[@]}" 00:05:47.904 08:18:00 -- common/autotest_common.sh@1523 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1485 -- # readlink -f /sys/class/nvme/nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1485 -- # grep 0000:d8:00.0/nvme/nvme 00:05:47.904 08:18:00 -- common/autotest_common.sh@1485 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1486 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:47.904 08:18:00 -- common/autotest_common.sh@1490 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1490 -- # printf '%s\n' nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1523 -- # nvme_ctrlr=/dev/nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1524 -- # [[ -z /dev/nvme0 ]] 00:05:47.904 08:18:00 -- common/autotest_common.sh@1529 -- # nvme id-ctrl /dev/nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1529 -- # grep oacs 00:05:47.904 08:18:00 -- common/autotest_common.sh@1529 -- # cut -d: -f2 00:05:47.904 08:18:00 -- common/autotest_common.sh@1529 -- # oacs=' 0xe' 00:05:47.904 08:18:00 -- common/autotest_common.sh@1530 -- # oacs_ns_manage=8 00:05:47.904 08:18:00 -- common/autotest_common.sh@1532 -- # [[ 8 -ne 0 ]] 00:05:47.904 08:18:00 -- common/autotest_common.sh@1538 -- # nvme id-ctrl /dev/nvme0 00:05:47.904 08:18:00 -- common/autotest_common.sh@1538 -- # grep unvmcap 00:05:47.904 08:18:00 -- common/autotest_common.sh@1538 -- # cut -d: -f2 00:05:47.904 08:18:00 -- common/autotest_common.sh@1538 -- # unvmcap=' 0' 00:05:47.904 08:18:00 -- common/autotest_common.sh@1539 -- # [[ 0 -eq 0 ]] 00:05:47.904 08:18:00 -- common/autotest_common.sh@1541 -- # continue 00:05:47.904 08:18:00 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:05:47.904 08:18:00 -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:47.904 08:18:00 -- common/autotest_common.sh@10 -- # set +x 00:05:47.904 08:18:00 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:05:47.904 08:18:00 -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:47.904 08:18:00 -- common/autotest_common.sh@10 -- # set +x 00:05:47.904 08:18:00 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:51.194 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:51.194 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:51.454 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:52.832 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:53.091 08:18:06 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:05:53.091 08:18:06 -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:53.091 08:18:06 -- common/autotest_common.sh@10 -- # set +x 00:05:53.091 08:18:06 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:05:53.091 08:18:06 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:53.091 08:18:06 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:53.091 08:18:06 -- common/autotest_common.sh@1561 -- # bdfs=() 00:05:53.091 08:18:06 -- common/autotest_common.sh@1561 -- # _bdfs=() 00:05:53.091 08:18:06 -- common/autotest_common.sh@1561 -- # local bdfs _bdfs 00:05:53.091 08:18:06 -- common/autotest_common.sh@1562 -- # _bdfs=($(get_nvme_bdfs)) 00:05:53.091 08:18:06 -- 
common/autotest_common.sh@1562 -- # get_nvme_bdfs 00:05:53.091 08:18:06 -- common/autotest_common.sh@1496 -- # bdfs=() 00:05:53.091 08:18:06 -- common/autotest_common.sh@1496 -- # local bdfs 00:05:53.091 08:18:06 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:53.091 08:18:06 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:53.091 08:18:06 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:05:53.091 08:18:06 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:05:53.091 08:18:06 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:05:53.091 08:18:06 -- common/autotest_common.sh@1563 -- # for bdf in "${_bdfs[@]}" 00:05:53.091 08:18:06 -- common/autotest_common.sh@1564 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:53.091 08:18:06 -- common/autotest_common.sh@1564 -- # device=0x0a54 00:05:53.091 08:18:06 -- common/autotest_common.sh@1565 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:53.091 08:18:06 -- common/autotest_common.sh@1566 -- # bdfs+=($bdf) 00:05:53.091 08:18:06 -- common/autotest_common.sh@1570 -- # (( 1 > 0 )) 00:05:53.091 08:18:06 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:53.091 08:18:06 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:53.091 08:18:06 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:53.091 08:18:06 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=973724 00:05:53.091 08:18:06 -- common/autotest_common.sh@1583 -- # waitforlisten 973724 00:05:53.091 08:18:06 -- common/autotest_common.sh@831 -- # '[' -z 973724 ']' 00:05:53.091 08:18:06 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:53.091 08:18:06 -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:53.091 08:18:06 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:53.091 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:53.091 08:18:06 -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:53.091 08:18:06 -- common/autotest_common.sh@10 -- # set +x 00:05:53.091 [2024-11-17 08:18:06.162128] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
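The opal_revert_cleanup path traced above first narrows the NVMe controller list to BDFs whose PCI device id is 0x0a54, using gen_nvme.sh for the address list and sysfs reads for the id check. A minimal sketch of the same filter using only standard sysfs paths (no SPDK scripts) follows; the loop mirrors the readlink and cat calls visible in the trace, and the device id to match is the one printed there.
# Sketch: list NVMe controllers whose PCI device id matches 0x0a54.
want=0x0a54
for ctrl in /sys/class/nvme/nvme*; do
    [ -e "$ctrl" ] || continue
    bdf=$(basename "$(readlink -f "$ctrl/device")")    # e.g. 0000:d8:00.0
    dev_id=$(cat /sys/bus/pci/devices/"$bdf"/device)   # PCI device id, e.g. 0x0a54
    if [ "$dev_id" = "$want" ]; then
        echo "$bdf"                                    # candidate for the opal revert step
    fi
done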
00:05:53.091 [2024-11-17 08:18:06.162172] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid973724 ] 00:05:53.091 [2024-11-17 08:18:06.225846] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.349 [2024-11-17 08:18:06.264083] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.350 08:18:06 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:53.350 08:18:06 -- common/autotest_common.sh@864 -- # return 0 00:05:53.350 08:18:06 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:05:53.350 08:18:06 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:05:53.350 08:18:06 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:56.636 nvme0n1 00:05:56.636 08:18:09 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:56.636 [2024-11-17 08:18:09.642529] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:56.636 request: 00:05:56.636 { 00:05:56.636 "nvme_ctrlr_name": "nvme0", 00:05:56.636 "password": "test", 00:05:56.636 "method": "bdev_nvme_opal_revert", 00:05:56.636 "req_id": 1 00:05:56.636 } 00:05:56.636 Got JSON-RPC error response 00:05:56.636 response: 00:05:56.636 { 00:05:56.636 "code": -32602, 00:05:56.636 "message": "Invalid parameters" 00:05:56.636 } 00:05:56.636 08:18:09 -- common/autotest_common.sh@1589 -- # true 00:05:56.636 08:18:09 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:05:56.636 08:18:09 -- common/autotest_common.sh@1593 -- # killprocess 973724 00:05:56.636 08:18:09 -- common/autotest_common.sh@950 -- # '[' -z 973724 ']' 00:05:56.636 08:18:09 -- common/autotest_common.sh@954 -- # kill -0 973724 00:05:56.636 08:18:09 -- common/autotest_common.sh@955 -- # uname 00:05:56.637 08:18:09 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:56.637 08:18:09 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 973724 00:05:56.637 08:18:09 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:56.637 08:18:09 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:56.637 08:18:09 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 973724' 00:05:56.637 killing process with pid 973724 00:05:56.637 08:18:09 -- common/autotest_common.sh@969 -- # kill 973724 00:05:56.637 08:18:09 -- common/autotest_common.sh@974 -- # wait 973724 00:05:59.171 08:18:11 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:05:59.171 08:18:11 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:05:59.171 08:18:11 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:05:59.171 08:18:11 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:05:59.171 08:18:11 -- spdk/autotest.sh@149 -- # timing_enter lib 00:05:59.171 08:18:11 -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:59.171 08:18:11 -- common/autotest_common.sh@10 -- # set +x 00:05:59.171 08:18:11 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:05:59.171 08:18:11 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:59.171 08:18:11 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:59.171 08:18:11 -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:05:59.171 08:18:11 -- common/autotest_common.sh@10 -- # set +x 00:05:59.171 ************************************ 00:05:59.171 START TEST env 00:05:59.171 ************************************ 00:05:59.171 08:18:11 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:59.171 * Looking for test storage... 00:05:59.171 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:59.171 08:18:12 env -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:59.171 08:18:12 env -- common/autotest_common.sh@1681 -- # lcov --version 00:05:59.171 08:18:12 env -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:59.171 08:18:12 env -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:59.171 08:18:12 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:59.171 08:18:12 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:59.172 08:18:12 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:59.172 08:18:12 env -- scripts/common.sh@336 -- # IFS=.-: 00:05:59.172 08:18:12 env -- scripts/common.sh@336 -- # read -ra ver1 00:05:59.172 08:18:12 env -- scripts/common.sh@337 -- # IFS=.-: 00:05:59.172 08:18:12 env -- scripts/common.sh@337 -- # read -ra ver2 00:05:59.172 08:18:12 env -- scripts/common.sh@338 -- # local 'op=<' 00:05:59.172 08:18:12 env -- scripts/common.sh@340 -- # ver1_l=2 00:05:59.172 08:18:12 env -- scripts/common.sh@341 -- # ver2_l=1 00:05:59.172 08:18:12 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:59.172 08:18:12 env -- scripts/common.sh@344 -- # case "$op" in 00:05:59.172 08:18:12 env -- scripts/common.sh@345 -- # : 1 00:05:59.172 08:18:12 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:59.172 08:18:12 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:59.172 08:18:12 env -- scripts/common.sh@365 -- # decimal 1 00:05:59.172 08:18:12 env -- scripts/common.sh@353 -- # local d=1 00:05:59.172 08:18:12 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:59.172 08:18:12 env -- scripts/common.sh@355 -- # echo 1 00:05:59.172 08:18:12 env -- scripts/common.sh@365 -- # ver1[v]=1 00:05:59.172 08:18:12 env -- scripts/common.sh@366 -- # decimal 2 00:05:59.172 08:18:12 env -- scripts/common.sh@353 -- # local d=2 00:05:59.172 08:18:12 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:59.172 08:18:12 env -- scripts/common.sh@355 -- # echo 2 00:05:59.172 08:18:12 env -- scripts/common.sh@366 -- # ver2[v]=2 00:05:59.172 08:18:12 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:59.172 08:18:12 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:59.172 08:18:12 env -- scripts/common.sh@368 -- # return 0 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:59.172 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.172 --rc genhtml_branch_coverage=1 00:05:59.172 --rc genhtml_function_coverage=1 00:05:59.172 --rc genhtml_legend=1 00:05:59.172 --rc geninfo_all_blocks=1 00:05:59.172 --rc geninfo_unexecuted_blocks=1 00:05:59.172 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.172 ' 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:59.172 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.172 --rc genhtml_branch_coverage=1 00:05:59.172 --rc genhtml_function_coverage=1 00:05:59.172 --rc genhtml_legend=1 00:05:59.172 --rc geninfo_all_blocks=1 00:05:59.172 --rc geninfo_unexecuted_blocks=1 00:05:59.172 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.172 ' 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:59.172 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.172 --rc genhtml_branch_coverage=1 00:05:59.172 --rc genhtml_function_coverage=1 00:05:59.172 --rc genhtml_legend=1 00:05:59.172 --rc geninfo_all_blocks=1 00:05:59.172 --rc geninfo_unexecuted_blocks=1 00:05:59.172 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.172 ' 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:59.172 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:59.172 --rc genhtml_branch_coverage=1 00:05:59.172 --rc genhtml_function_coverage=1 00:05:59.172 --rc genhtml_legend=1 00:05:59.172 --rc geninfo_all_blocks=1 00:05:59.172 --rc geninfo_unexecuted_blocks=1 00:05:59.172 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:59.172 ' 00:05:59.172 08:18:12 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:59.172 08:18:12 env -- common/autotest_common.sh@10 -- # set +x 00:05:59.172 ************************************ 00:05:59.172 START TEST env_memory 00:05:59.172 ************************************ 00:05:59.172 08:18:12 env.env_memory -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:59.172 00:05:59.172 00:05:59.172 CUnit - A unit testing framework for C - Version 2.1-3 00:05:59.172 http://cunit.sourceforge.net/ 00:05:59.172 00:05:59.172 00:05:59.172 Suite: memory 00:05:59.172 Test: alloc and free memory map ...[2024-11-17 08:18:12.171160] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:59.172 passed 00:05:59.172 Test: mem map translation ...[2024-11-17 08:18:12.183889] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:59.172 [2024-11-17 08:18:12.183906] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:59.172 [2024-11-17 08:18:12.183936] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:59.172 [2024-11-17 08:18:12.183945] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:59.172 passed 00:05:59.172 Test: mem map registration ...[2024-11-17 08:18:12.204061] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:05:59.172 [2024-11-17 08:18:12.204077] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:05:59.172 passed 00:05:59.172 Test: mem map adjacent registrations ...passed 00:05:59.172 00:05:59.172 Run Summary: Type Total Ran Passed Failed Inactive 00:05:59.172 suites 1 1 n/a 0 0 00:05:59.172 tests 4 4 4 0 0 00:05:59.172 asserts 152 152 152 0 n/a 00:05:59.172 00:05:59.172 Elapsed time = 0.082 seconds 00:05:59.172 00:05:59.172 real 0m0.096s 00:05:59.172 user 0m0.081s 00:05:59.172 sys 0m0.015s 00:05:59.172 08:18:12 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:59.172 08:18:12 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:59.172 ************************************ 00:05:59.172 END TEST env_memory 00:05:59.172 ************************************ 00:05:59.172 08:18:12 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:59.172 08:18:12 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:59.172 08:18:12 env -- common/autotest_common.sh@10 -- # set +x 00:05:59.432 ************************************ 00:05:59.432 START TEST env_vtophys 00:05:59.432 ************************************ 00:05:59.432 08:18:12 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:59.432 EAL: lib.eal log level changed from notice to debug 00:05:59.432 EAL: Detected lcore 0 as core 0 on socket 0 00:05:59.432 EAL: Detected lcore 1 as core 1 on socket 0 00:05:59.432 EAL: Detected lcore 2 as core 2 on socket 0 00:05:59.432 EAL: Detected lcore 3 as 
core 3 on socket 0 00:05:59.432 EAL: Detected lcore 4 as core 4 on socket 0 00:05:59.432 EAL: Detected lcore 5 as core 5 on socket 0 00:05:59.432 EAL: Detected lcore 6 as core 6 on socket 0 00:05:59.432 EAL: Detected lcore 7 as core 8 on socket 0 00:05:59.432 EAL: Detected lcore 8 as core 9 on socket 0 00:05:59.432 EAL: Detected lcore 9 as core 10 on socket 0 00:05:59.432 EAL: Detected lcore 10 as core 11 on socket 0 00:05:59.432 EAL: Detected lcore 11 as core 12 on socket 0 00:05:59.432 EAL: Detected lcore 12 as core 13 on socket 0 00:05:59.432 EAL: Detected lcore 13 as core 14 on socket 0 00:05:59.432 EAL: Detected lcore 14 as core 16 on socket 0 00:05:59.432 EAL: Detected lcore 15 as core 17 on socket 0 00:05:59.432 EAL: Detected lcore 16 as core 18 on socket 0 00:05:59.432 EAL: Detected lcore 17 as core 19 on socket 0 00:05:59.432 EAL: Detected lcore 18 as core 20 on socket 0 00:05:59.432 EAL: Detected lcore 19 as core 21 on socket 0 00:05:59.432 EAL: Detected lcore 20 as core 22 on socket 0 00:05:59.432 EAL: Detected lcore 21 as core 24 on socket 0 00:05:59.432 EAL: Detected lcore 22 as core 25 on socket 0 00:05:59.432 EAL: Detected lcore 23 as core 26 on socket 0 00:05:59.432 EAL: Detected lcore 24 as core 27 on socket 0 00:05:59.432 EAL: Detected lcore 25 as core 28 on socket 0 00:05:59.432 EAL: Detected lcore 26 as core 29 on socket 0 00:05:59.432 EAL: Detected lcore 27 as core 30 on socket 0 00:05:59.432 EAL: Detected lcore 28 as core 0 on socket 1 00:05:59.432 EAL: Detected lcore 29 as core 1 on socket 1 00:05:59.432 EAL: Detected lcore 30 as core 2 on socket 1 00:05:59.432 EAL: Detected lcore 31 as core 3 on socket 1 00:05:59.432 EAL: Detected lcore 32 as core 4 on socket 1 00:05:59.432 EAL: Detected lcore 33 as core 5 on socket 1 00:05:59.432 EAL: Detected lcore 34 as core 6 on socket 1 00:05:59.432 EAL: Detected lcore 35 as core 8 on socket 1 00:05:59.432 EAL: Detected lcore 36 as core 9 on socket 1 00:05:59.432 EAL: Detected lcore 37 as core 10 on socket 1 00:05:59.432 EAL: Detected lcore 38 as core 11 on socket 1 00:05:59.432 EAL: Detected lcore 39 as core 12 on socket 1 00:05:59.432 EAL: Detected lcore 40 as core 13 on socket 1 00:05:59.432 EAL: Detected lcore 41 as core 14 on socket 1 00:05:59.432 EAL: Detected lcore 42 as core 16 on socket 1 00:05:59.432 EAL: Detected lcore 43 as core 17 on socket 1 00:05:59.432 EAL: Detected lcore 44 as core 18 on socket 1 00:05:59.432 EAL: Detected lcore 45 as core 19 on socket 1 00:05:59.432 EAL: Detected lcore 46 as core 20 on socket 1 00:05:59.432 EAL: Detected lcore 47 as core 21 on socket 1 00:05:59.432 EAL: Detected lcore 48 as core 22 on socket 1 00:05:59.432 EAL: Detected lcore 49 as core 24 on socket 1 00:05:59.432 EAL: Detected lcore 50 as core 25 on socket 1 00:05:59.432 EAL: Detected lcore 51 as core 26 on socket 1 00:05:59.432 EAL: Detected lcore 52 as core 27 on socket 1 00:05:59.432 EAL: Detected lcore 53 as core 28 on socket 1 00:05:59.432 EAL: Detected lcore 54 as core 29 on socket 1 00:05:59.433 EAL: Detected lcore 55 as core 30 on socket 1 00:05:59.433 EAL: Detected lcore 56 as core 0 on socket 0 00:05:59.433 EAL: Detected lcore 57 as core 1 on socket 0 00:05:59.433 EAL: Detected lcore 58 as core 2 on socket 0 00:05:59.433 EAL: Detected lcore 59 as core 3 on socket 0 00:05:59.433 EAL: Detected lcore 60 as core 4 on socket 0 00:05:59.433 EAL: Detected lcore 61 as core 5 on socket 0 00:05:59.433 EAL: Detected lcore 62 as core 6 on socket 0 00:05:59.433 EAL: Detected lcore 63 as core 8 on socket 0 00:05:59.433 EAL: 
Detected lcore 64 as core 9 on socket 0 00:05:59.433 EAL: Detected lcore 65 as core 10 on socket 0 00:05:59.433 EAL: Detected lcore 66 as core 11 on socket 0 00:05:59.433 EAL: Detected lcore 67 as core 12 on socket 0 00:05:59.433 EAL: Detected lcore 68 as core 13 on socket 0 00:05:59.433 EAL: Detected lcore 69 as core 14 on socket 0 00:05:59.433 EAL: Detected lcore 70 as core 16 on socket 0 00:05:59.433 EAL: Detected lcore 71 as core 17 on socket 0 00:05:59.433 EAL: Detected lcore 72 as core 18 on socket 0 00:05:59.433 EAL: Detected lcore 73 as core 19 on socket 0 00:05:59.433 EAL: Detected lcore 74 as core 20 on socket 0 00:05:59.433 EAL: Detected lcore 75 as core 21 on socket 0 00:05:59.433 EAL: Detected lcore 76 as core 22 on socket 0 00:05:59.433 EAL: Detected lcore 77 as core 24 on socket 0 00:05:59.433 EAL: Detected lcore 78 as core 25 on socket 0 00:05:59.433 EAL: Detected lcore 79 as core 26 on socket 0 00:05:59.433 EAL: Detected lcore 80 as core 27 on socket 0 00:05:59.433 EAL: Detected lcore 81 as core 28 on socket 0 00:05:59.433 EAL: Detected lcore 82 as core 29 on socket 0 00:05:59.433 EAL: Detected lcore 83 as core 30 on socket 0 00:05:59.433 EAL: Detected lcore 84 as core 0 on socket 1 00:05:59.433 EAL: Detected lcore 85 as core 1 on socket 1 00:05:59.433 EAL: Detected lcore 86 as core 2 on socket 1 00:05:59.433 EAL: Detected lcore 87 as core 3 on socket 1 00:05:59.433 EAL: Detected lcore 88 as core 4 on socket 1 00:05:59.433 EAL: Detected lcore 89 as core 5 on socket 1 00:05:59.433 EAL: Detected lcore 90 as core 6 on socket 1 00:05:59.433 EAL: Detected lcore 91 as core 8 on socket 1 00:05:59.433 EAL: Detected lcore 92 as core 9 on socket 1 00:05:59.433 EAL: Detected lcore 93 as core 10 on socket 1 00:05:59.433 EAL: Detected lcore 94 as core 11 on socket 1 00:05:59.433 EAL: Detected lcore 95 as core 12 on socket 1 00:05:59.433 EAL: Detected lcore 96 as core 13 on socket 1 00:05:59.433 EAL: Detected lcore 97 as core 14 on socket 1 00:05:59.433 EAL: Detected lcore 98 as core 16 on socket 1 00:05:59.433 EAL: Detected lcore 99 as core 17 on socket 1 00:05:59.433 EAL: Detected lcore 100 as core 18 on socket 1 00:05:59.433 EAL: Detected lcore 101 as core 19 on socket 1 00:05:59.433 EAL: Detected lcore 102 as core 20 on socket 1 00:05:59.433 EAL: Detected lcore 103 as core 21 on socket 1 00:05:59.433 EAL: Detected lcore 104 as core 22 on socket 1 00:05:59.433 EAL: Detected lcore 105 as core 24 on socket 1 00:05:59.433 EAL: Detected lcore 106 as core 25 on socket 1 00:05:59.433 EAL: Detected lcore 107 as core 26 on socket 1 00:05:59.433 EAL: Detected lcore 108 as core 27 on socket 1 00:05:59.433 EAL: Detected lcore 109 as core 28 on socket 1 00:05:59.433 EAL: Detected lcore 110 as core 29 on socket 1 00:05:59.433 EAL: Detected lcore 111 as core 30 on socket 1 00:05:59.433 EAL: Maximum logical cores by configuration: 128 00:05:59.433 EAL: Detected CPU lcores: 112 00:05:59.433 EAL: Detected NUMA nodes: 2 00:05:59.433 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:59.433 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:59.433 EAL: Checking presence of .so 'librte_eal.so' 00:05:59.433 EAL: Detected static linkage of DPDK 00:05:59.433 EAL: No shared files mode enabled, IPC will be disabled 00:05:59.433 EAL: Bus pci wants IOVA as 'DC' 00:05:59.433 EAL: Buses did not request a specific IOVA mode. 00:05:59.433 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:59.433 EAL: Selected IOVA mode 'VA' 00:05:59.433 EAL: Probing VFIO support... 
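
The lcore inventory above maps 112 logical cores onto 2 NUMA sockets. A small, hypothetical helper for tallying those lines from a saved console log (the file name is illustrative, not part of the test suite):

#!/usr/bin/env python3
# Hypothetical helper: count the "Detected lcore X as core Y on socket Z"
# lines above per socket, reading from a saved console log.
import re
from collections import Counter

LCORE_RE = re.compile(r"Detected lcore (\d+) as core (\d+) on socket (\d+)")

def lcores_per_socket(log_path):
    counts = Counter()
    with open(log_path) as fh:
        for line in fh:
            match = LCORE_RE.search(line)
            if match:
                counts[int(match.group(3))] += 1
    return dict(counts)

# For the run above this should report 56 lcores on each of the two sockets.
print(lcores_per_socket("console.log"))
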
00:05:59.433 EAL: IOMMU type 1 (Type 1) is supported 00:05:59.433 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:59.433 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:59.433 EAL: VFIO support initialized 00:05:59.433 EAL: Ask a virtual area of 0x2e000 bytes 00:05:59.433 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:59.433 EAL: Setting up physically contiguous memory... 00:05:59.433 EAL: Setting maximum number of open files to 524288 00:05:59.433 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:59.433 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:59.433 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:59.433 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:59.433 EAL: Ask a virtual area of 0x61000 bytes 00:05:59.433 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:59.433 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:59.433 EAL: Ask a virtual area of 0x400000000 bytes 00:05:59.433 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:59.433 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:59.433 EAL: Hugepages will be freed exactly as allocated. 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: TSC frequency is ~2500000 KHz 00:05:59.433 EAL: Main lcore 0 is ready (tid=7f8dc5894a00;cpuset=[0]) 00:05:59.433 EAL: Trying to obtain current memory policy. 00:05:59.433 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.433 EAL: Restoring previous memory policy: 0 00:05:59.433 EAL: request: mp_malloc_sync 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Heap on socket 0 was expanded by 2MB 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Mem event callback 'spdk:(nil)' registered 00:05:59.433 00:05:59.433 00:05:59.433 CUnit - A unit testing framework for C - Version 2.1-3 00:05:59.433 http://cunit.sourceforge.net/ 00:05:59.433 00:05:59.433 00:05:59.433 Suite: components_suite 00:05:59.433 Test: vtophys_malloc_test ...passed 00:05:59.433 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:59.433 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.433 EAL: Restoring previous memory policy: 4 00:05:59.433 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.433 EAL: request: mp_malloc_sync 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Heap on socket 0 was expanded by 4MB 00:05:59.433 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.433 EAL: request: mp_malloc_sync 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Heap on socket 0 was shrunk by 4MB 00:05:59.433 EAL: Trying to obtain current memory policy. 00:05:59.433 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.433 EAL: Restoring previous memory policy: 4 00:05:59.433 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.433 EAL: request: mp_malloc_sync 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Heap on socket 0 was expanded by 6MB 00:05:59.433 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.433 EAL: request: mp_malloc_sync 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Heap on socket 0 was shrunk by 6MB 00:05:59.433 EAL: Trying to obtain current memory policy. 00:05:59.433 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.433 EAL: Restoring previous memory policy: 4 00:05:59.433 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.433 EAL: request: mp_malloc_sync 00:05:59.433 EAL: No shared files mode enabled, IPC is disabled 00:05:59.433 EAL: Heap on socket 0 was expanded by 10MB 00:05:59.433 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was shrunk by 10MB 00:05:59.434 EAL: Trying to obtain current memory policy. 
00:05:59.434 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.434 EAL: Restoring previous memory policy: 4 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was expanded by 18MB 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was shrunk by 18MB 00:05:59.434 EAL: Trying to obtain current memory policy. 00:05:59.434 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.434 EAL: Restoring previous memory policy: 4 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was expanded by 34MB 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was shrunk by 34MB 00:05:59.434 EAL: Trying to obtain current memory policy. 00:05:59.434 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.434 EAL: Restoring previous memory policy: 4 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was expanded by 66MB 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was shrunk by 66MB 00:05:59.434 EAL: Trying to obtain current memory policy. 00:05:59.434 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.434 EAL: Restoring previous memory policy: 4 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was expanded by 130MB 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was shrunk by 130MB 00:05:59.434 EAL: Trying to obtain current memory policy. 00:05:59.434 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.434 EAL: Restoring previous memory policy: 4 00:05:59.434 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.434 EAL: request: mp_malloc_sync 00:05:59.434 EAL: No shared files mode enabled, IPC is disabled 00:05:59.434 EAL: Heap on socket 0 was expanded by 258MB 00:05:59.693 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.693 EAL: request: mp_malloc_sync 00:05:59.693 EAL: No shared files mode enabled, IPC is disabled 00:05:59.693 EAL: Heap on socket 0 was shrunk by 258MB 00:05:59.693 EAL: Trying to obtain current memory policy. 
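
The heap expansions reported by vtophys_spdk_malloc_test above follow a simple progression of 2**n + 2 MB per round (4, 6, 10, 18, ... MB); the remaining 514 MB and 1026 MB rounds appear just below. A one-liner reproducing the observed sequence:

#!/usr/bin/env python3
# Observed expansion sizes in the trace: 2**n + 2 MB for n = 1..10.
sizes_mb = [2**n + 2 for n in range(1, 11)]
assert sizes_mb == [4, 6, 10, 18, 34, 66, 130, 258, 514, 1026]
print(sizes_mb)
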
00:05:59.693 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.693 EAL: Restoring previous memory policy: 4 00:05:59.693 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.693 EAL: request: mp_malloc_sync 00:05:59.693 EAL: No shared files mode enabled, IPC is disabled 00:05:59.693 EAL: Heap on socket 0 was expanded by 514MB 00:05:59.693 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.952 EAL: request: mp_malloc_sync 00:05:59.952 EAL: No shared files mode enabled, IPC is disabled 00:05:59.952 EAL: Heap on socket 0 was shrunk by 514MB 00:05:59.952 EAL: Trying to obtain current memory policy. 00:05:59.952 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:59.952 EAL: Restoring previous memory policy: 4 00:05:59.952 EAL: Calling mem event callback 'spdk:(nil)' 00:05:59.952 EAL: request: mp_malloc_sync 00:05:59.952 EAL: No shared files mode enabled, IPC is disabled 00:05:59.952 EAL: Heap on socket 0 was expanded by 1026MB 00:06:00.211 EAL: Calling mem event callback 'spdk:(nil)' 00:06:00.470 EAL: request: mp_malloc_sync 00:06:00.470 EAL: No shared files mode enabled, IPC is disabled 00:06:00.470 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:00.470 passed 00:06:00.470 00:06:00.470 Run Summary: Type Total Ran Passed Failed Inactive 00:06:00.470 suites 1 1 n/a 0 0 00:06:00.470 tests 2 2 2 0 0 00:06:00.470 asserts 497 497 497 0 n/a 00:06:00.470 00:06:00.470 Elapsed time = 0.954 seconds 00:06:00.470 EAL: Calling mem event callback 'spdk:(nil)' 00:06:00.471 EAL: request: mp_malloc_sync 00:06:00.471 EAL: No shared files mode enabled, IPC is disabled 00:06:00.471 EAL: Heap on socket 0 was shrunk by 2MB 00:06:00.471 EAL: No shared files mode enabled, IPC is disabled 00:06:00.471 EAL: No shared files mode enabled, IPC is disabled 00:06:00.471 EAL: No shared files mode enabled, IPC is disabled 00:06:00.471 00:06:00.471 real 0m1.069s 00:06:00.471 user 0m0.621s 00:06:00.471 sys 0m0.423s 00:06:00.471 08:18:13 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:00.471 08:18:13 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:00.471 ************************************ 00:06:00.471 END TEST env_vtophys 00:06:00.471 ************************************ 00:06:00.471 08:18:13 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:00.471 08:18:13 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:00.471 08:18:13 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:00.471 08:18:13 env -- common/autotest_common.sh@10 -- # set +x 00:06:00.471 ************************************ 00:06:00.471 START TEST env_pci 00:06:00.471 ************************************ 00:06:00.471 08:18:13 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:00.471 00:06:00.471 00:06:00.471 CUnit - A unit testing framework for C - Version 2.1-3 00:06:00.471 http://cunit.sourceforge.net/ 00:06:00.471 00:06:00.471 00:06:00.471 Suite: pci 00:06:00.471 Test: pci_hook ...[2024-11-17 08:18:13.474291] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1050:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 975018 has claimed it 00:06:00.471 EAL: Cannot find device (10000:00:01.0) 00:06:00.471 EAL: Failed to attach device on primary process 00:06:00.471 passed 00:06:00.471 00:06:00.471 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:00.471 suites 1 1 n/a 0 0 00:06:00.471 tests 1 1 1 0 0 00:06:00.471 asserts 25 25 25 0 n/a 00:06:00.471 00:06:00.471 Elapsed time = 0.027 seconds 00:06:00.471 00:06:00.471 real 0m0.037s 00:06:00.471 user 0m0.007s 00:06:00.471 sys 0m0.029s 00:06:00.471 08:18:13 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:00.471 08:18:13 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:00.471 ************************************ 00:06:00.471 END TEST env_pci 00:06:00.471 ************************************ 00:06:00.471 08:18:13 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:00.471 08:18:13 env -- env/env.sh@15 -- # uname 00:06:00.471 08:18:13 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:00.471 08:18:13 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:00.471 08:18:13 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:00.471 08:18:13 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:06:00.471 08:18:13 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:00.471 08:18:13 env -- common/autotest_common.sh@10 -- # set +x 00:06:00.471 ************************************ 00:06:00.471 START TEST env_dpdk_post_init 00:06:00.471 ************************************ 00:06:00.471 08:18:13 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:00.730 EAL: Detected CPU lcores: 112 00:06:00.730 EAL: Detected NUMA nodes: 2 00:06:00.730 EAL: Detected static linkage of DPDK 00:06:00.730 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:00.730 EAL: Selected IOVA mode 'VA' 00:06:00.730 EAL: VFIO support initialized 00:06:00.730 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:00.730 EAL: Using IOMMU type 1 (Type 1) 00:06:01.675 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:04.961 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:04.961 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:05.220 Starting DPDK initialization... 00:06:05.220 Starting SPDK post initialization... 00:06:05.220 SPDK NVMe probe 00:06:05.220 Attaching to 0000:d8:00.0 00:06:05.220 Attached to 0000:d8:00.0 00:06:05.220 Cleaning up... 
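
env_dpdk_post_init has just attached to and detached from 0000:d8:00.0, the disk that setup.sh rebound from nvme to vfio-pci earlier in the log. A hypothetical sysfs check for that binding, using standard Linux paths rather than anything from the test suite:

#!/usr/bin/env python3
# Hypothetical sysfs check: read the PCI device ID and the bound kernel driver
# for a BDF, e.g. the 0000:d8:00.0 disk rebound to vfio-pci earlier in the log.
import os

def pci_info(bdf):
    base = f"/sys/bus/pci/devices/{bdf}"
    with open(os.path.join(base, "device")) as fh:
        device_id = fh.read().strip()   # "0x0a54" for this drive, per the trace
    driver_link = os.path.join(base, "driver")
    driver = None
    if os.path.islink(driver_link):
        driver = os.path.basename(os.readlink(driver_link))
    return device_id, driver

print(pci_info("0000:d8:00.0"))         # expected here: ('0x0a54', 'vfio-pci')
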
00:06:05.220 00:06:05.220 real 0m4.655s 00:06:05.220 user 0m3.509s 00:06:05.220 sys 0m0.391s 00:06:05.220 08:18:18 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:05.220 08:18:18 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:05.220 ************************************ 00:06:05.220 END TEST env_dpdk_post_init 00:06:05.220 ************************************ 00:06:05.220 08:18:18 env -- env/env.sh@26 -- # uname 00:06:05.220 08:18:18 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:05.220 08:18:18 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:05.220 08:18:18 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:05.220 08:18:18 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:05.220 08:18:18 env -- common/autotest_common.sh@10 -- # set +x 00:06:05.220 ************************************ 00:06:05.220 START TEST env_mem_callbacks 00:06:05.220 ************************************ 00:06:05.220 08:18:18 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:05.220 EAL: Detected CPU lcores: 112 00:06:05.220 EAL: Detected NUMA nodes: 2 00:06:05.220 EAL: Detected static linkage of DPDK 00:06:05.220 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:05.480 EAL: Selected IOVA mode 'VA' 00:06:05.480 EAL: VFIO support initialized 00:06:05.480 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:05.480 00:06:05.480 00:06:05.480 CUnit - A unit testing framework for C - Version 2.1-3 00:06:05.480 http://cunit.sourceforge.net/ 00:06:05.480 00:06:05.480 00:06:05.480 Suite: memory 00:06:05.480 Test: test ... 
00:06:05.480 register 0x200000200000 2097152 00:06:05.480 malloc 3145728 00:06:05.480 register 0x200000400000 4194304 00:06:05.480 buf 0x200000500000 len 3145728 PASSED 00:06:05.480 malloc 64 00:06:05.480 buf 0x2000004fff40 len 64 PASSED 00:06:05.480 malloc 4194304 00:06:05.480 register 0x200000800000 6291456 00:06:05.480 buf 0x200000a00000 len 4194304 PASSED 00:06:05.480 free 0x200000500000 3145728 00:06:05.480 free 0x2000004fff40 64 00:06:05.480 unregister 0x200000400000 4194304 PASSED 00:06:05.480 free 0x200000a00000 4194304 00:06:05.480 unregister 0x200000800000 6291456 PASSED 00:06:05.480 malloc 8388608 00:06:05.480 register 0x200000400000 10485760 00:06:05.480 buf 0x200000600000 len 8388608 PASSED 00:06:05.480 free 0x200000600000 8388608 00:06:05.480 unregister 0x200000400000 10485760 PASSED 00:06:05.480 passed 00:06:05.480 00:06:05.480 Run Summary: Type Total Ran Passed Failed Inactive 00:06:05.480 suites 1 1 n/a 0 0 00:06:05.480 tests 1 1 1 0 0 00:06:05.480 asserts 15 15 15 0 n/a 00:06:05.480 00:06:05.480 Elapsed time = 0.005 seconds 00:06:05.480 00:06:05.480 real 0m0.050s 00:06:05.480 user 0m0.013s 00:06:05.480 sys 0m0.037s 00:06:05.480 08:18:18 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:05.480 08:18:18 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:05.480 ************************************ 00:06:05.480 END TEST env_mem_callbacks 00:06:05.480 ************************************ 00:06:05.480 00:06:05.480 real 0m6.518s 00:06:05.480 user 0m4.474s 00:06:05.480 sys 0m1.306s 00:06:05.480 08:18:18 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:05.480 08:18:18 env -- common/autotest_common.sh@10 -- # set +x 00:06:05.480 ************************************ 00:06:05.480 END TEST env 00:06:05.480 ************************************ 00:06:05.480 08:18:18 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:05.480 08:18:18 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:05.480 08:18:18 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:05.480 08:18:18 -- common/autotest_common.sh@10 -- # set +x 00:06:05.480 ************************************ 00:06:05.480 START TEST rpc 00:06:05.480 ************************************ 00:06:05.480 08:18:18 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:05.480 * Looking for test storage... 
00:06:05.480 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:05.480 08:18:18 rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:05.480 08:18:18 rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:05.480 08:18:18 rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:05.740 08:18:18 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:05.740 08:18:18 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:05.740 08:18:18 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:05.740 08:18:18 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:05.740 08:18:18 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:05.740 08:18:18 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:05.740 08:18:18 rpc -- scripts/common.sh@345 -- # : 1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:05.740 08:18:18 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:05.740 08:18:18 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@353 -- # local d=1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:05.740 08:18:18 rpc -- scripts/common.sh@355 -- # echo 1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:05.740 08:18:18 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@353 -- # local d=2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:05.740 08:18:18 rpc -- scripts/common.sh@355 -- # echo 2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:05.740 08:18:18 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:05.740 08:18:18 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:05.740 08:18:18 rpc -- scripts/common.sh@368 -- # return 0 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:05.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.740 --rc genhtml_branch_coverage=1 00:06:05.740 --rc genhtml_function_coverage=1 00:06:05.740 --rc genhtml_legend=1 00:06:05.740 --rc geninfo_all_blocks=1 00:06:05.740 --rc geninfo_unexecuted_blocks=1 00:06:05.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.740 ' 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:05.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.740 --rc genhtml_branch_coverage=1 00:06:05.740 --rc genhtml_function_coverage=1 00:06:05.740 --rc genhtml_legend=1 00:06:05.740 --rc geninfo_all_blocks=1 00:06:05.740 --rc geninfo_unexecuted_blocks=1 00:06:05.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.740 ' 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@1695 -- # 
export 'LCOV=lcov 00:06:05.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.740 --rc genhtml_branch_coverage=1 00:06:05.740 --rc genhtml_function_coverage=1 00:06:05.740 --rc genhtml_legend=1 00:06:05.740 --rc geninfo_all_blocks=1 00:06:05.740 --rc geninfo_unexecuted_blocks=1 00:06:05.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.740 ' 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:05.740 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.740 --rc genhtml_branch_coverage=1 00:06:05.740 --rc genhtml_function_coverage=1 00:06:05.740 --rc genhtml_legend=1 00:06:05.740 --rc geninfo_all_blocks=1 00:06:05.740 --rc geninfo_unexecuted_blocks=1 00:06:05.740 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:05.740 ' 00:06:05.740 08:18:18 rpc -- rpc/rpc.sh@65 -- # spdk_pid=976187 00:06:05.740 08:18:18 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:05.740 08:18:18 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:05.740 08:18:18 rpc -- rpc/rpc.sh@67 -- # waitforlisten 976187 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@831 -- # '[' -z 976187 ']' 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:05.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:05.740 08:18:18 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:05.740 [2024-11-17 08:18:18.712660] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:05.740 [2024-11-17 08:18:18.712741] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid976187 ] 00:06:05.740 [2024-11-17 08:18:18.779205] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.740 [2024-11-17 08:18:18.816954] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:05.740 [2024-11-17 08:18:18.816997] app.c: 614:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 976187' to capture a snapshot of events at runtime. 00:06:05.740 [2024-11-17 08:18:18.817006] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:05.740 [2024-11-17 08:18:18.817014] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:05.740 [2024-11-17 08:18:18.817022] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid976187 for offline analysis/debug. 
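
The rpc suite that follows exercises this spdk_tgt with scripts/rpc.py and jq; rpc_integrity, for example, first confirms that bdev_get_bdevs returns an empty list before creating Malloc0. A sketch of that same check, assuming the rpc.py path of this workspace:

#!/usr/bin/env python3
# Sketch of the first rpc_integrity check: bdev_get_bdevs piped through
# `jq length` must be 0 before Malloc0 exists.
import json
import subprocess

RPC_PY = "/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py"

def bdev_count():
    out = subprocess.check_output([RPC_PY, "bdev_get_bdevs"])
    return len(json.loads(out))

assert bdev_count() == 0, "expected an empty bdev list at the start of the test"
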
00:06:05.740 [2024-11-17 08:18:18.817043] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.000 08:18:19 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:06.000 08:18:19 rpc -- common/autotest_common.sh@864 -- # return 0 00:06:06.000 08:18:19 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:06.000 08:18:19 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:06.000 08:18:19 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:06.000 08:18:19 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:06.000 08:18:19 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:06.000 08:18:19 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:06.000 08:18:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:06.000 ************************************ 00:06:06.000 START TEST rpc_integrity 00:06:06.000 ************************************ 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.000 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:06.000 { 00:06:06.000 "name": "Malloc0", 00:06:06.000 "aliases": [ 00:06:06.000 "ca76950d-8e7e-4d04-a92b-b5287391ef3b" 00:06:06.000 ], 00:06:06.000 "product_name": "Malloc disk", 00:06:06.000 "block_size": 512, 00:06:06.000 "num_blocks": 16384, 00:06:06.000 "uuid": "ca76950d-8e7e-4d04-a92b-b5287391ef3b", 00:06:06.000 "assigned_rate_limits": { 00:06:06.000 "rw_ios_per_sec": 0, 00:06:06.000 "rw_mbytes_per_sec": 0, 00:06:06.000 "r_mbytes_per_sec": 0, 00:06:06.000 "w_mbytes_per_sec": 
0 00:06:06.000 }, 00:06:06.000 "claimed": false, 00:06:06.000 "zoned": false, 00:06:06.000 "supported_io_types": { 00:06:06.000 "read": true, 00:06:06.000 "write": true, 00:06:06.000 "unmap": true, 00:06:06.000 "flush": true, 00:06:06.000 "reset": true, 00:06:06.000 "nvme_admin": false, 00:06:06.000 "nvme_io": false, 00:06:06.000 "nvme_io_md": false, 00:06:06.000 "write_zeroes": true, 00:06:06.000 "zcopy": true, 00:06:06.000 "get_zone_info": false, 00:06:06.000 "zone_management": false, 00:06:06.000 "zone_append": false, 00:06:06.000 "compare": false, 00:06:06.000 "compare_and_write": false, 00:06:06.000 "abort": true, 00:06:06.000 "seek_hole": false, 00:06:06.000 "seek_data": false, 00:06:06.000 "copy": true, 00:06:06.000 "nvme_iov_md": false 00:06:06.000 }, 00:06:06.000 "memory_domains": [ 00:06:06.000 { 00:06:06.000 "dma_device_id": "system", 00:06:06.000 "dma_device_type": 1 00:06:06.000 }, 00:06:06.000 { 00:06:06.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:06.000 "dma_device_type": 2 00:06:06.000 } 00:06:06.000 ], 00:06:06.000 "driver_specific": {} 00:06:06.000 } 00:06:06.000 ]' 00:06:06.000 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.267 [2024-11-17 08:18:19.184271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:06.267 [2024-11-17 08:18:19.184305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:06.267 [2024-11-17 08:18:19.184320] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x622c8c0 00:06:06.267 [2024-11-17 08:18:19.184329] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:06.267 [2024-11-17 08:18:19.185196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:06.267 [2024-11-17 08:18:19.185221] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:06.267 Passthru0 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:06.267 { 00:06:06.267 "name": "Malloc0", 00:06:06.267 "aliases": [ 00:06:06.267 "ca76950d-8e7e-4d04-a92b-b5287391ef3b" 00:06:06.267 ], 00:06:06.267 "product_name": "Malloc disk", 00:06:06.267 "block_size": 512, 00:06:06.267 "num_blocks": 16384, 00:06:06.267 "uuid": "ca76950d-8e7e-4d04-a92b-b5287391ef3b", 00:06:06.267 "assigned_rate_limits": { 00:06:06.267 "rw_ios_per_sec": 0, 00:06:06.267 "rw_mbytes_per_sec": 0, 00:06:06.267 "r_mbytes_per_sec": 0, 00:06:06.267 "w_mbytes_per_sec": 0 00:06:06.267 }, 00:06:06.267 "claimed": true, 00:06:06.267 "claim_type": "exclusive_write", 00:06:06.267 "zoned": false, 00:06:06.267 "supported_io_types": { 00:06:06.267 "read": true, 00:06:06.267 "write": true, 00:06:06.267 "unmap": true, 
00:06:06.267 "flush": true, 00:06:06.267 "reset": true, 00:06:06.267 "nvme_admin": false, 00:06:06.267 "nvme_io": false, 00:06:06.267 "nvme_io_md": false, 00:06:06.267 "write_zeroes": true, 00:06:06.267 "zcopy": true, 00:06:06.267 "get_zone_info": false, 00:06:06.267 "zone_management": false, 00:06:06.267 "zone_append": false, 00:06:06.267 "compare": false, 00:06:06.267 "compare_and_write": false, 00:06:06.267 "abort": true, 00:06:06.267 "seek_hole": false, 00:06:06.267 "seek_data": false, 00:06:06.267 "copy": true, 00:06:06.267 "nvme_iov_md": false 00:06:06.267 }, 00:06:06.267 "memory_domains": [ 00:06:06.267 { 00:06:06.267 "dma_device_id": "system", 00:06:06.267 "dma_device_type": 1 00:06:06.267 }, 00:06:06.267 { 00:06:06.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:06.267 "dma_device_type": 2 00:06:06.267 } 00:06:06.267 ], 00:06:06.267 "driver_specific": {} 00:06:06.267 }, 00:06:06.267 { 00:06:06.267 "name": "Passthru0", 00:06:06.267 "aliases": [ 00:06:06.267 "2996914b-35f5-5780-a3db-b763bdd429a6" 00:06:06.267 ], 00:06:06.267 "product_name": "passthru", 00:06:06.267 "block_size": 512, 00:06:06.267 "num_blocks": 16384, 00:06:06.267 "uuid": "2996914b-35f5-5780-a3db-b763bdd429a6", 00:06:06.267 "assigned_rate_limits": { 00:06:06.267 "rw_ios_per_sec": 0, 00:06:06.267 "rw_mbytes_per_sec": 0, 00:06:06.267 "r_mbytes_per_sec": 0, 00:06:06.267 "w_mbytes_per_sec": 0 00:06:06.267 }, 00:06:06.267 "claimed": false, 00:06:06.267 "zoned": false, 00:06:06.267 "supported_io_types": { 00:06:06.267 "read": true, 00:06:06.267 "write": true, 00:06:06.267 "unmap": true, 00:06:06.267 "flush": true, 00:06:06.267 "reset": true, 00:06:06.267 "nvme_admin": false, 00:06:06.267 "nvme_io": false, 00:06:06.267 "nvme_io_md": false, 00:06:06.267 "write_zeroes": true, 00:06:06.267 "zcopy": true, 00:06:06.267 "get_zone_info": false, 00:06:06.267 "zone_management": false, 00:06:06.267 "zone_append": false, 00:06:06.267 "compare": false, 00:06:06.267 "compare_and_write": false, 00:06:06.267 "abort": true, 00:06:06.267 "seek_hole": false, 00:06:06.267 "seek_data": false, 00:06:06.267 "copy": true, 00:06:06.267 "nvme_iov_md": false 00:06:06.267 }, 00:06:06.267 "memory_domains": [ 00:06:06.267 { 00:06:06.267 "dma_device_id": "system", 00:06:06.267 "dma_device_type": 1 00:06:06.267 }, 00:06:06.267 { 00:06:06.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:06.267 "dma_device_type": 2 00:06:06.267 } 00:06:06.267 ], 00:06:06.267 "driver_specific": { 00:06:06.267 "passthru": { 00:06:06.267 "name": "Passthru0", 00:06:06.267 "base_bdev_name": "Malloc0" 00:06:06.267 } 00:06:06.267 } 00:06:06.267 } 00:06:06.267 ]' 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.267 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.267 08:18:19 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.267 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.268 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:06.268 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:06.268 08:18:19 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:06.268 00:06:06.268 real 0m0.279s 00:06:06.268 user 0m0.172s 00:06:06.268 sys 0m0.052s 00:06:06.268 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:06.268 08:18:19 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:06.268 ************************************ 00:06:06.268 END TEST rpc_integrity 00:06:06.268 ************************************ 00:06:06.268 08:18:19 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:06.268 08:18:19 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:06.268 08:18:19 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:06.268 08:18:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:06.268 ************************************ 00:06:06.268 START TEST rpc_plugins 00:06:06.268 ************************************ 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:06.526 { 00:06:06.526 "name": "Malloc1", 00:06:06.526 "aliases": [ 00:06:06.526 "e7754069-8ccf-4ea0-91a3-fbadeee71d07" 00:06:06.526 ], 00:06:06.526 "product_name": "Malloc disk", 00:06:06.526 "block_size": 4096, 00:06:06.526 "num_blocks": 256, 00:06:06.526 "uuid": "e7754069-8ccf-4ea0-91a3-fbadeee71d07", 00:06:06.526 "assigned_rate_limits": { 00:06:06.526 "rw_ios_per_sec": 0, 00:06:06.526 "rw_mbytes_per_sec": 0, 00:06:06.526 "r_mbytes_per_sec": 0, 00:06:06.526 "w_mbytes_per_sec": 0 00:06:06.526 }, 00:06:06.526 "claimed": false, 00:06:06.526 "zoned": false, 00:06:06.526 "supported_io_types": { 00:06:06.526 "read": true, 00:06:06.526 "write": true, 00:06:06.526 "unmap": true, 00:06:06.526 "flush": true, 00:06:06.526 "reset": true, 00:06:06.526 "nvme_admin": false, 00:06:06.526 "nvme_io": false, 00:06:06.526 "nvme_io_md": false, 00:06:06.526 "write_zeroes": true, 00:06:06.526 "zcopy": true, 00:06:06.526 "get_zone_info": false, 00:06:06.526 "zone_management": false, 00:06:06.526 "zone_append": false, 00:06:06.526 "compare": false, 00:06:06.526 "compare_and_write": false, 00:06:06.526 "abort": true, 00:06:06.526 "seek_hole": false, 00:06:06.526 "seek_data": false, 00:06:06.526 "copy": true, 00:06:06.526 
"nvme_iov_md": false 00:06:06.526 }, 00:06:06.526 "memory_domains": [ 00:06:06.526 { 00:06:06.526 "dma_device_id": "system", 00:06:06.526 "dma_device_type": 1 00:06:06.526 }, 00:06:06.526 { 00:06:06.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:06.526 "dma_device_type": 2 00:06:06.526 } 00:06:06.526 ], 00:06:06.526 "driver_specific": {} 00:06:06.526 } 00:06:06.526 ]' 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:06.526 08:18:19 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:06.526 00:06:06.526 real 0m0.129s 00:06:06.526 user 0m0.077s 00:06:06.526 sys 0m0.021s 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:06.526 08:18:19 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 ************************************ 00:06:06.526 END TEST rpc_plugins 00:06:06.526 ************************************ 00:06:06.526 08:18:19 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:06.526 08:18:19 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:06.526 08:18:19 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:06.526 08:18:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 ************************************ 00:06:06.526 START TEST rpc_trace_cmd_test 00:06:06.526 ************************************ 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:06.526 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid976187", 00:06:06.526 "tpoint_group_mask": "0x8", 00:06:06.526 "iscsi_conn": { 00:06:06.526 "mask": "0x2", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "scsi": { 00:06:06.526 "mask": "0x4", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "bdev": { 00:06:06.526 "mask": "0x8", 00:06:06.526 "tpoint_mask": "0xffffffffffffffff" 00:06:06.526 }, 00:06:06.526 "nvmf_rdma": { 00:06:06.526 "mask": "0x10", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "nvmf_tcp": { 00:06:06.526 "mask": "0x20", 
00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "ftl": { 00:06:06.526 "mask": "0x40", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "blobfs": { 00:06:06.526 "mask": "0x80", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "dsa": { 00:06:06.526 "mask": "0x200", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "thread": { 00:06:06.526 "mask": "0x400", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "nvme_pcie": { 00:06:06.526 "mask": "0x800", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "iaa": { 00:06:06.526 "mask": "0x1000", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "nvme_tcp": { 00:06:06.526 "mask": "0x2000", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "bdev_nvme": { 00:06:06.526 "mask": "0x4000", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "sock": { 00:06:06.526 "mask": "0x8000", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "blob": { 00:06:06.526 "mask": "0x10000", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 }, 00:06:06.526 "bdev_raid": { 00:06:06.526 "mask": "0x20000", 00:06:06.526 "tpoint_mask": "0x0" 00:06:06.526 } 00:06:06.526 }' 00:06:06.526 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 18 -gt 2 ']' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:06.785 00:06:06.785 real 0m0.227s 00:06:06.785 user 0m0.190s 00:06:06.785 sys 0m0.028s 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:06.785 08:18:19 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:06.785 ************************************ 00:06:06.785 END TEST rpc_trace_cmd_test 00:06:06.785 ************************************ 00:06:06.785 08:18:19 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:06.785 08:18:19 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:06.785 08:18:19 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:06.785 08:18:19 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:06.785 08:18:19 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:06.785 08:18:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:06.785 ************************************ 00:06:06.785 START TEST rpc_daemon_integrity 00:06:06.785 ************************************ 00:06:06.785 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:06.785 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:06.785 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:06.785 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.044 08:18:19 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.044 08:18:19 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.044 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.044 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:07.044 { 00:06:07.044 "name": "Malloc2", 00:06:07.044 "aliases": [ 00:06:07.044 "e04c6176-ba2a-4b35-9870-041a154fcf3e" 00:06:07.044 ], 00:06:07.044 "product_name": "Malloc disk", 00:06:07.044 "block_size": 512, 00:06:07.044 "num_blocks": 16384, 00:06:07.044 "uuid": "e04c6176-ba2a-4b35-9870-041a154fcf3e", 00:06:07.044 "assigned_rate_limits": { 00:06:07.044 "rw_ios_per_sec": 0, 00:06:07.044 "rw_mbytes_per_sec": 0, 00:06:07.044 "r_mbytes_per_sec": 0, 00:06:07.044 "w_mbytes_per_sec": 0 00:06:07.044 }, 00:06:07.044 "claimed": false, 00:06:07.044 "zoned": false, 00:06:07.044 "supported_io_types": { 00:06:07.044 "read": true, 00:06:07.044 "write": true, 00:06:07.044 "unmap": true, 00:06:07.044 "flush": true, 00:06:07.044 "reset": true, 00:06:07.044 "nvme_admin": false, 00:06:07.044 "nvme_io": false, 00:06:07.044 "nvme_io_md": false, 00:06:07.044 "write_zeroes": true, 00:06:07.044 "zcopy": true, 00:06:07.044 "get_zone_info": false, 00:06:07.044 "zone_management": false, 00:06:07.044 "zone_append": false, 00:06:07.044 "compare": false, 00:06:07.044 "compare_and_write": false, 00:06:07.044 "abort": true, 00:06:07.044 "seek_hole": false, 00:06:07.044 "seek_data": false, 00:06:07.044 "copy": true, 00:06:07.044 "nvme_iov_md": false 00:06:07.044 }, 00:06:07.044 "memory_domains": [ 00:06:07.044 { 00:06:07.044 "dma_device_id": "system", 00:06:07.044 "dma_device_type": 1 00:06:07.044 }, 00:06:07.044 { 00:06:07.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:07.044 "dma_device_type": 2 00:06:07.044 } 00:06:07.044 ], 00:06:07.044 "driver_specific": {} 00:06:07.044 } 00:06:07.044 ]' 00:06:07.044 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:07.044 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.045 [2024-11-17 08:18:20.054523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:07.045 [2024-11-17 08:18:20.054560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:06:07.045 [2024-11-17 08:18:20.054580] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x634e060 00:06:07.045 [2024-11-17 08:18:20.054589] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:07.045 [2024-11-17 08:18:20.055413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:07.045 [2024-11-17 08:18:20.055436] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:07.045 Passthru0 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:07.045 { 00:06:07.045 "name": "Malloc2", 00:06:07.045 "aliases": [ 00:06:07.045 "e04c6176-ba2a-4b35-9870-041a154fcf3e" 00:06:07.045 ], 00:06:07.045 "product_name": "Malloc disk", 00:06:07.045 "block_size": 512, 00:06:07.045 "num_blocks": 16384, 00:06:07.045 "uuid": "e04c6176-ba2a-4b35-9870-041a154fcf3e", 00:06:07.045 "assigned_rate_limits": { 00:06:07.045 "rw_ios_per_sec": 0, 00:06:07.045 "rw_mbytes_per_sec": 0, 00:06:07.045 "r_mbytes_per_sec": 0, 00:06:07.045 "w_mbytes_per_sec": 0 00:06:07.045 }, 00:06:07.045 "claimed": true, 00:06:07.045 "claim_type": "exclusive_write", 00:06:07.045 "zoned": false, 00:06:07.045 "supported_io_types": { 00:06:07.045 "read": true, 00:06:07.045 "write": true, 00:06:07.045 "unmap": true, 00:06:07.045 "flush": true, 00:06:07.045 "reset": true, 00:06:07.045 "nvme_admin": false, 00:06:07.045 "nvme_io": false, 00:06:07.045 "nvme_io_md": false, 00:06:07.045 "write_zeroes": true, 00:06:07.045 "zcopy": true, 00:06:07.045 "get_zone_info": false, 00:06:07.045 "zone_management": false, 00:06:07.045 "zone_append": false, 00:06:07.045 "compare": false, 00:06:07.045 "compare_and_write": false, 00:06:07.045 "abort": true, 00:06:07.045 "seek_hole": false, 00:06:07.045 "seek_data": false, 00:06:07.045 "copy": true, 00:06:07.045 "nvme_iov_md": false 00:06:07.045 }, 00:06:07.045 "memory_domains": [ 00:06:07.045 { 00:06:07.045 "dma_device_id": "system", 00:06:07.045 "dma_device_type": 1 00:06:07.045 }, 00:06:07.045 { 00:06:07.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:07.045 "dma_device_type": 2 00:06:07.045 } 00:06:07.045 ], 00:06:07.045 "driver_specific": {} 00:06:07.045 }, 00:06:07.045 { 00:06:07.045 "name": "Passthru0", 00:06:07.045 "aliases": [ 00:06:07.045 "63d2462a-b70d-5a79-bd5d-b606cad1f355" 00:06:07.045 ], 00:06:07.045 "product_name": "passthru", 00:06:07.045 "block_size": 512, 00:06:07.045 "num_blocks": 16384, 00:06:07.045 "uuid": "63d2462a-b70d-5a79-bd5d-b606cad1f355", 00:06:07.045 "assigned_rate_limits": { 00:06:07.045 "rw_ios_per_sec": 0, 00:06:07.045 "rw_mbytes_per_sec": 0, 00:06:07.045 "r_mbytes_per_sec": 0, 00:06:07.045 "w_mbytes_per_sec": 0 00:06:07.045 }, 00:06:07.045 "claimed": false, 00:06:07.045 "zoned": false, 00:06:07.045 "supported_io_types": { 00:06:07.045 "read": true, 00:06:07.045 "write": true, 00:06:07.045 "unmap": true, 00:06:07.045 "flush": true, 00:06:07.045 "reset": true, 00:06:07.045 "nvme_admin": false, 00:06:07.045 "nvme_io": false, 00:06:07.045 "nvme_io_md": false, 
00:06:07.045 "write_zeroes": true, 00:06:07.045 "zcopy": true, 00:06:07.045 "get_zone_info": false, 00:06:07.045 "zone_management": false, 00:06:07.045 "zone_append": false, 00:06:07.045 "compare": false, 00:06:07.045 "compare_and_write": false, 00:06:07.045 "abort": true, 00:06:07.045 "seek_hole": false, 00:06:07.045 "seek_data": false, 00:06:07.045 "copy": true, 00:06:07.045 "nvme_iov_md": false 00:06:07.045 }, 00:06:07.045 "memory_domains": [ 00:06:07.045 { 00:06:07.045 "dma_device_id": "system", 00:06:07.045 "dma_device_type": 1 00:06:07.045 }, 00:06:07.045 { 00:06:07.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:07.045 "dma_device_type": 2 00:06:07.045 } 00:06:07.045 ], 00:06:07.045 "driver_specific": { 00:06:07.045 "passthru": { 00:06:07.045 "name": "Passthru0", 00:06:07.045 "base_bdev_name": "Malloc2" 00:06:07.045 } 00:06:07.045 } 00:06:07.045 } 00:06:07.045 ]' 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:07.045 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:07.304 08:18:20 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:07.304 00:06:07.304 real 0m0.283s 00:06:07.304 user 0m0.186s 00:06:07.304 sys 0m0.048s 00:06:07.304 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:07.304 08:18:20 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:07.304 ************************************ 00:06:07.304 END TEST rpc_daemon_integrity 00:06:07.304 ************************************ 00:06:07.304 08:18:20 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:07.304 08:18:20 rpc -- rpc/rpc.sh@84 -- # killprocess 976187 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@950 -- # '[' -z 976187 ']' 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@954 -- # kill -0 976187 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@955 -- # uname 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 976187 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:07.304 
08:18:20 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 976187' 00:06:07.304 killing process with pid 976187 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@969 -- # kill 976187 00:06:07.304 08:18:20 rpc -- common/autotest_common.sh@974 -- # wait 976187 00:06:07.564 00:06:07.564 real 0m2.117s 00:06:07.564 user 0m2.677s 00:06:07.564 sys 0m0.798s 00:06:07.564 08:18:20 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:07.564 08:18:20 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.564 ************************************ 00:06:07.564 END TEST rpc 00:06:07.564 ************************************ 00:06:07.564 08:18:20 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:07.564 08:18:20 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:07.564 08:18:20 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:07.564 08:18:20 -- common/autotest_common.sh@10 -- # set +x 00:06:07.564 ************************************ 00:06:07.564 START TEST skip_rpc 00:06:07.564 ************************************ 00:06:07.564 08:18:20 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:07.823 * Looking for test storage... 00:06:07.823 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:07.823 08:18:20 skip_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:07.823 08:18:20 skip_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:07.823 08:18:20 skip_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:07.823 08:18:20 skip_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:07.823 08:18:20 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:07.824 08:18:20 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:07.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.824 --rc genhtml_branch_coverage=1 00:06:07.824 --rc genhtml_function_coverage=1 00:06:07.824 --rc genhtml_legend=1 00:06:07.824 --rc geninfo_all_blocks=1 00:06:07.824 --rc geninfo_unexecuted_blocks=1 00:06:07.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.824 ' 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:07.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.824 --rc genhtml_branch_coverage=1 00:06:07.824 --rc genhtml_function_coverage=1 00:06:07.824 --rc genhtml_legend=1 00:06:07.824 --rc geninfo_all_blocks=1 00:06:07.824 --rc geninfo_unexecuted_blocks=1 00:06:07.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.824 ' 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:07.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.824 --rc genhtml_branch_coverage=1 00:06:07.824 --rc genhtml_function_coverage=1 00:06:07.824 --rc genhtml_legend=1 00:06:07.824 --rc geninfo_all_blocks=1 00:06:07.824 --rc geninfo_unexecuted_blocks=1 00:06:07.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.824 ' 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:07.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.824 --rc genhtml_branch_coverage=1 00:06:07.824 --rc genhtml_function_coverage=1 00:06:07.824 --rc genhtml_legend=1 00:06:07.824 --rc geninfo_all_blocks=1 00:06:07.824 --rc geninfo_unexecuted_blocks=1 00:06:07.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.824 ' 00:06:07.824 08:18:20 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:07.824 08:18:20 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:07.824 08:18:20 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:07.824 08:18:20 
skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:07.824 08:18:20 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:07.824 ************************************ 00:06:07.824 START TEST skip_rpc 00:06:07.824 ************************************ 00:06:07.824 08:18:20 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:06:07.824 08:18:20 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=976644 00:06:07.824 08:18:20 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:07.824 08:18:20 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:07.824 08:18:20 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:07.824 [2024-11-17 08:18:20.948363] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:07.824 [2024-11-17 08:18:20.948419] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid976644 ] 00:06:08.084 [2024-11-17 08:18:21.014998] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.084 [2024-11-17 08:18:21.052682] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 976644 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 976644 ']' 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 976644 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:13.359 08:18:25 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 976644 00:06:13.359 
08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:13.359 08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:13.359 08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 976644' 00:06:13.359 killing process with pid 976644 00:06:13.359 08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 976644 00:06:13.359 08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 976644 00:06:13.359 00:06:13.359 real 0m5.391s 00:06:13.359 user 0m5.148s 00:06:13.359 sys 0m0.295s 00:06:13.359 08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:13.359 08:18:26 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:13.359 ************************************ 00:06:13.359 END TEST skip_rpc 00:06:13.359 ************************************ 00:06:13.359 08:18:26 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:13.359 08:18:26 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:13.359 08:18:26 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:13.359 08:18:26 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:13.359 ************************************ 00:06:13.359 START TEST skip_rpc_with_json 00:06:13.359 ************************************ 00:06:13.359 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:06:13.359 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:13.359 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=977729 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 977729 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 977729 ']' 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:13.360 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:13.360 [2024-11-17 08:18:26.426254] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
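For reference, the skip_rpc case that finishes above reduces to one check: start spdk_tgt with --no-rpc-server and confirm that an ordinary RPC call fails. The sketch below is a minimal manual reproduction, not the rpc/skip_rpc.sh script itself; it assumes a built SPDK tree with ./build/bin/spdk_tgt and scripts/rpc.py available, plus the default /var/tmp/spdk.sock socket.

    # Launch the target without an RPC server, as TEST skip_rpc does.
    ./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    tgt_pid=$!
    sleep 5

    # With no RPC server listening, any method call should fail (non-zero exit).
    if ./scripts/rpc.py spdk_get_version; then
        echo "unexpected: RPC succeeded despite --no-rpc-server"
    else
        echo "expected: RPC call failed because no server is listening"
    fi

    kill -9 "$tgt_pid"
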
00:06:13.360 [2024-11-17 08:18:26.426312] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid977729 ] 00:06:13.360 [2024-11-17 08:18:26.493042] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.619 [2024-11-17 08:18:26.532520] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:13.619 [2024-11-17 08:18:26.727867] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:13.619 request: 00:06:13.619 { 00:06:13.619 "trtype": "tcp", 00:06:13.619 "method": "nvmf_get_transports", 00:06:13.619 "req_id": 1 00:06:13.619 } 00:06:13.619 Got JSON-RPC error response 00:06:13.619 response: 00:06:13.619 { 00:06:13.619 "code": -19, 00:06:13.619 "message": "No such device" 00:06:13.619 } 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:13.619 [2024-11-17 08:18:26.739965] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.619 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:13.879 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:13.879 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:13.879 { 00:06:13.879 "subsystems": [ 00:06:13.879 { 00:06:13.879 "subsystem": "scheduler", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "framework_set_scheduler", 00:06:13.879 "params": { 00:06:13.879 "name": "static" 00:06:13.879 } 00:06:13.879 } 00:06:13.879 ] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "vmd", 00:06:13.879 "config": [] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "sock", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "sock_set_default_impl", 00:06:13.879 "params": { 00:06:13.879 "impl_name": "posix" 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "sock_impl_set_options", 00:06:13.879 "params": { 00:06:13.879 "impl_name": "ssl", 00:06:13.879 "recv_buf_size": 4096, 00:06:13.879 "send_buf_size": 4096, 00:06:13.879 "enable_recv_pipe": true, 00:06:13.879 "enable_quickack": false, 00:06:13.879 "enable_placement_id": 
0, 00:06:13.879 "enable_zerocopy_send_server": true, 00:06:13.879 "enable_zerocopy_send_client": false, 00:06:13.879 "zerocopy_threshold": 0, 00:06:13.879 "tls_version": 0, 00:06:13.879 "enable_ktls": false 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "sock_impl_set_options", 00:06:13.879 "params": { 00:06:13.879 "impl_name": "posix", 00:06:13.879 "recv_buf_size": 2097152, 00:06:13.879 "send_buf_size": 2097152, 00:06:13.879 "enable_recv_pipe": true, 00:06:13.879 "enable_quickack": false, 00:06:13.879 "enable_placement_id": 0, 00:06:13.879 "enable_zerocopy_send_server": true, 00:06:13.879 "enable_zerocopy_send_client": false, 00:06:13.879 "zerocopy_threshold": 0, 00:06:13.879 "tls_version": 0, 00:06:13.879 "enable_ktls": false 00:06:13.879 } 00:06:13.879 } 00:06:13.879 ] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "iobuf", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "iobuf_set_options", 00:06:13.879 "params": { 00:06:13.879 "small_pool_count": 8192, 00:06:13.879 "large_pool_count": 1024, 00:06:13.879 "small_bufsize": 8192, 00:06:13.879 "large_bufsize": 135168 00:06:13.879 } 00:06:13.879 } 00:06:13.879 ] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "keyring", 00:06:13.879 "config": [] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "vfio_user_target", 00:06:13.879 "config": null 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "fsdev", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "fsdev_set_opts", 00:06:13.879 "params": { 00:06:13.879 "fsdev_io_pool_size": 65535, 00:06:13.879 "fsdev_io_cache_size": 256 00:06:13.879 } 00:06:13.879 } 00:06:13.879 ] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "accel", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "accel_set_options", 00:06:13.879 "params": { 00:06:13.879 "small_cache_size": 128, 00:06:13.879 "large_cache_size": 16, 00:06:13.879 "task_count": 2048, 00:06:13.879 "sequence_count": 2048, 00:06:13.879 "buf_count": 2048 00:06:13.879 } 00:06:13.879 } 00:06:13.879 ] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "bdev", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "bdev_set_options", 00:06:13.879 "params": { 00:06:13.879 "bdev_io_pool_size": 65535, 00:06:13.879 "bdev_io_cache_size": 256, 00:06:13.879 "bdev_auto_examine": true, 00:06:13.879 "iobuf_small_cache_size": 128, 00:06:13.879 "iobuf_large_cache_size": 16 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "bdev_raid_set_options", 00:06:13.879 "params": { 00:06:13.879 "process_window_size_kb": 1024, 00:06:13.879 "process_max_bandwidth_mb_sec": 0 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "bdev_nvme_set_options", 00:06:13.879 "params": { 00:06:13.879 "action_on_timeout": "none", 00:06:13.879 "timeout_us": 0, 00:06:13.879 "timeout_admin_us": 0, 00:06:13.879 "keep_alive_timeout_ms": 10000, 00:06:13.879 "arbitration_burst": 0, 00:06:13.879 "low_priority_weight": 0, 00:06:13.879 "medium_priority_weight": 0, 00:06:13.879 "high_priority_weight": 0, 00:06:13.879 "nvme_adminq_poll_period_us": 10000, 00:06:13.879 "nvme_ioq_poll_period_us": 0, 00:06:13.879 "io_queue_requests": 0, 00:06:13.879 "delay_cmd_submit": true, 00:06:13.879 "transport_retry_count": 4, 00:06:13.879 "bdev_retry_count": 3, 00:06:13.879 "transport_ack_timeout": 0, 00:06:13.879 "ctrlr_loss_timeout_sec": 0, 00:06:13.879 "reconnect_delay_sec": 0, 00:06:13.879 "fast_io_fail_timeout_sec": 0, 00:06:13.879 "disable_auto_failback": false, 
00:06:13.879 "generate_uuids": false, 00:06:13.879 "transport_tos": 0, 00:06:13.879 "nvme_error_stat": false, 00:06:13.879 "rdma_srq_size": 0, 00:06:13.879 "io_path_stat": false, 00:06:13.879 "allow_accel_sequence": false, 00:06:13.879 "rdma_max_cq_size": 0, 00:06:13.879 "rdma_cm_event_timeout_ms": 0, 00:06:13.879 "dhchap_digests": [ 00:06:13.879 "sha256", 00:06:13.879 "sha384", 00:06:13.879 "sha512" 00:06:13.879 ], 00:06:13.879 "dhchap_dhgroups": [ 00:06:13.879 "null", 00:06:13.879 "ffdhe2048", 00:06:13.879 "ffdhe3072", 00:06:13.879 "ffdhe4096", 00:06:13.879 "ffdhe6144", 00:06:13.879 "ffdhe8192" 00:06:13.879 ] 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "bdev_nvme_set_hotplug", 00:06:13.879 "params": { 00:06:13.879 "period_us": 100000, 00:06:13.879 "enable": false 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "bdev_iscsi_set_options", 00:06:13.879 "params": { 00:06:13.879 "timeout_sec": 30 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "bdev_wait_for_examine" 00:06:13.879 } 00:06:13.879 ] 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "subsystem": "nvmf", 00:06:13.879 "config": [ 00:06:13.879 { 00:06:13.879 "method": "nvmf_set_config", 00:06:13.879 "params": { 00:06:13.879 "discovery_filter": "match_any", 00:06:13.879 "admin_cmd_passthru": { 00:06:13.879 "identify_ctrlr": false 00:06:13.879 }, 00:06:13.879 "dhchap_digests": [ 00:06:13.879 "sha256", 00:06:13.879 "sha384", 00:06:13.879 "sha512" 00:06:13.879 ], 00:06:13.879 "dhchap_dhgroups": [ 00:06:13.879 "null", 00:06:13.879 "ffdhe2048", 00:06:13.879 "ffdhe3072", 00:06:13.879 "ffdhe4096", 00:06:13.879 "ffdhe6144", 00:06:13.879 "ffdhe8192" 00:06:13.879 ] 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "nvmf_set_max_subsystems", 00:06:13.879 "params": { 00:06:13.879 "max_subsystems": 1024 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "nvmf_set_crdt", 00:06:13.879 "params": { 00:06:13.879 "crdt1": 0, 00:06:13.879 "crdt2": 0, 00:06:13.879 "crdt3": 0 00:06:13.879 } 00:06:13.879 }, 00:06:13.879 { 00:06:13.879 "method": "nvmf_create_transport", 00:06:13.879 "params": { 00:06:13.879 "trtype": "TCP", 00:06:13.879 "max_queue_depth": 128, 00:06:13.879 "max_io_qpairs_per_ctrlr": 127, 00:06:13.879 "in_capsule_data_size": 4096, 00:06:13.879 "max_io_size": 131072, 00:06:13.879 "io_unit_size": 131072, 00:06:13.879 "max_aq_depth": 128, 00:06:13.879 "num_shared_buffers": 511, 00:06:13.879 "buf_cache_size": 4294967295, 00:06:13.879 "dif_insert_or_strip": false, 00:06:13.880 "zcopy": false, 00:06:13.880 "c2h_success": true, 00:06:13.880 "sock_priority": 0, 00:06:13.880 "abort_timeout_sec": 1, 00:06:13.880 "ack_timeout": 0, 00:06:13.880 "data_wr_pool_size": 0 00:06:13.880 } 00:06:13.880 } 00:06:13.880 ] 00:06:13.880 }, 00:06:13.880 { 00:06:13.880 "subsystem": "nbd", 00:06:13.880 "config": [] 00:06:13.880 }, 00:06:13.880 { 00:06:13.880 "subsystem": "ublk", 00:06:13.880 "config": [] 00:06:13.880 }, 00:06:13.880 { 00:06:13.880 "subsystem": "vhost_blk", 00:06:13.880 "config": [] 00:06:13.880 }, 00:06:13.880 { 00:06:13.880 "subsystem": "scsi", 00:06:13.880 "config": null 00:06:13.880 }, 00:06:13.880 { 00:06:13.880 "subsystem": "iscsi", 00:06:13.880 "config": [ 00:06:13.880 { 00:06:13.880 "method": "iscsi_set_options", 00:06:13.880 "params": { 00:06:13.880 "node_base": "iqn.2016-06.io.spdk", 00:06:13.880 "max_sessions": 128, 00:06:13.880 "max_connections_per_session": 2, 00:06:13.880 "max_queue_depth": 64, 00:06:13.880 "default_time2wait": 2, 00:06:13.880 
"default_time2retain": 20, 00:06:13.880 "first_burst_length": 8192, 00:06:13.880 "immediate_data": true, 00:06:13.880 "allow_duplicated_isid": false, 00:06:13.880 "error_recovery_level": 0, 00:06:13.880 "nop_timeout": 60, 00:06:13.880 "nop_in_interval": 30, 00:06:13.880 "disable_chap": false, 00:06:13.880 "require_chap": false, 00:06:13.880 "mutual_chap": false, 00:06:13.880 "chap_group": 0, 00:06:13.880 "max_large_datain_per_connection": 64, 00:06:13.880 "max_r2t_per_connection": 4, 00:06:13.880 "pdu_pool_size": 36864, 00:06:13.880 "immediate_data_pool_size": 16384, 00:06:13.880 "data_out_pool_size": 2048 00:06:13.880 } 00:06:13.880 } 00:06:13.880 ] 00:06:13.880 }, 00:06:13.880 { 00:06:13.880 "subsystem": "vhost_scsi", 00:06:13.880 "config": [] 00:06:13.880 } 00:06:13.880 ] 00:06:13.880 } 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 977729 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 977729 ']' 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 977729 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 977729 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 977729' 00:06:13.880 killing process with pid 977729 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 977729 00:06:13.880 08:18:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 977729 00:06:14.448 08:18:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=977759 00:06:14.448 08:18:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:14.448 08:18:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 977759 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 977759 ']' 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 977759 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 977759 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 977759' 
00:06:19.721 killing process with pid 977759 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 977759 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 977759 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:19.721 00:06:19.721 real 0m6.283s 00:06:19.721 user 0m5.961s 00:06:19.721 sys 0m0.652s 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:19.721 ************************************ 00:06:19.721 END TEST skip_rpc_with_json 00:06:19.721 ************************************ 00:06:19.721 08:18:32 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:19.721 08:18:32 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:19.721 08:18:32 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:19.721 08:18:32 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:19.721 ************************************ 00:06:19.721 START TEST skip_rpc_with_delay 00:06:19.721 ************************************ 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:19.721 [2024-11-17 08:18:32.798749] 
app.c: 840:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:19.721 [2024-11-17 08:18:32.798883] app.c: 719:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:19.721 00:06:19.721 real 0m0.045s 00:06:19.721 user 0m0.024s 00:06:19.721 sys 0m0.021s 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:19.721 08:18:32 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:19.721 ************************************ 00:06:19.721 END TEST skip_rpc_with_delay 00:06:19.721 ************************************ 00:06:19.721 08:18:32 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:19.981 08:18:32 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:19.981 08:18:32 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:19.981 08:18:32 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:19.981 08:18:32 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:19.981 08:18:32 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:19.981 ************************************ 00:06:19.981 START TEST exit_on_failed_rpc_init 00:06:19.981 ************************************ 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=978858 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 978858 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 978858 ']' 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:19.981 08:18:32 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:19.982 [2024-11-17 08:18:32.929313] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
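The exit_on_failed_rpc_init case starting here provokes the failure reported a few lines below: a second spdk_tgt is launched while the first still owns /var/tmp/spdk.sock, so RPC initialization fails and the second instance exits. The sketch below is an illustrative reconstruction under the same assumptions as the previous sketch (built tree, default socket); the alternate socket name in the final comment is hypothetical and not something this run uses.

    # The first target claims the default RPC socket, /var/tmp/spdk.sock.
    ./build/bin/spdk_tgt -m 0x1 &
    first_pid=$!
    sleep 5

    # A second instance pointed at the same socket fails during RPC listen
    # ("/var/tmp/spdk.sock in use. Specify another.") and exits non-zero.
    ./build/bin/spdk_tgt -m 0x2 || echo "expected: second instance failed to init RPC"

    # Running two targets side by side needs distinct sockets, e.g.
    # ./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk2.sock (illustrative name).

    kill -9 "$first_pid"
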
00:06:19.982 [2024-11-17 08:18:32.929395] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid978858 ] 00:06:19.982 [2024-11-17 08:18:32.999086] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.982 [2024-11-17 08:18:33.041309] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:20.241 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:20.241 [2024-11-17 08:18:33.267910] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:20.241 [2024-11-17 08:18:33.267967] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid978871 ] 00:06:20.241 [2024-11-17 08:18:33.333188] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.241 [2024-11-17 08:18:33.371577] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:20.241 [2024-11-17 08:18:33.371661] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:20.241 [2024-11-17 08:18:33.371673] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:20.242 [2024-11-17 08:18:33.371682] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 978858 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 978858 ']' 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 978858 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 978858 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 978858' 00:06:20.502 killing process with pid 978858 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 978858 00:06:20.502 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 978858 00:06:20.762 00:06:20.762 real 0m0.905s 00:06:20.762 user 0m0.915s 00:06:20.762 sys 0m0.427s 00:06:20.762 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:20.762 08:18:33 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:20.762 ************************************ 00:06:20.762 END TEST exit_on_failed_rpc_init 00:06:20.762 ************************************ 00:06:20.762 08:18:33 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:20.762 00:06:20.762 real 0m13.160s 00:06:20.762 user 0m12.268s 00:06:20.762 sys 0m1.754s 00:06:20.762 08:18:33 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:20.762 08:18:33 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.762 ************************************ 00:06:20.762 END TEST skip_rpc 00:06:20.762 ************************************ 00:06:20.762 08:18:33 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:20.762 08:18:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:20.762 08:18:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:20.762 08:18:33 -- 
common/autotest_common.sh@10 -- # set +x 00:06:21.023 ************************************ 00:06:21.023 START TEST rpc_client 00:06:21.023 ************************************ 00:06:21.023 08:18:33 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:21.023 * Looking for test storage... 00:06:21.023 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1681 -- # lcov --version 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:21.023 08:18:34 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:21.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.023 --rc genhtml_branch_coverage=1 00:06:21.023 --rc genhtml_function_coverage=1 00:06:21.023 --rc genhtml_legend=1 00:06:21.023 --rc geninfo_all_blocks=1 00:06:21.023 --rc geninfo_unexecuted_blocks=1 00:06:21.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.023 ' 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:21.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.023 --rc genhtml_branch_coverage=1 00:06:21.023 --rc genhtml_function_coverage=1 00:06:21.023 --rc genhtml_legend=1 00:06:21.023 --rc geninfo_all_blocks=1 00:06:21.023 --rc geninfo_unexecuted_blocks=1 00:06:21.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.023 ' 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:21.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.023 --rc genhtml_branch_coverage=1 00:06:21.023 --rc genhtml_function_coverage=1 00:06:21.023 --rc genhtml_legend=1 00:06:21.023 --rc geninfo_all_blocks=1 00:06:21.023 --rc geninfo_unexecuted_blocks=1 00:06:21.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.023 ' 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:21.023 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.023 --rc genhtml_branch_coverage=1 00:06:21.023 --rc genhtml_function_coverage=1 00:06:21.023 --rc genhtml_legend=1 00:06:21.023 --rc geninfo_all_blocks=1 00:06:21.023 --rc geninfo_unexecuted_blocks=1 00:06:21.023 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.023 ' 00:06:21.023 08:18:34 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:21.023 OK 00:06:21.023 08:18:34 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:21.023 00:06:21.023 real 0m0.209s 00:06:21.023 user 0m0.115s 00:06:21.023 sys 0m0.111s 00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 
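The rpc_client trace above steps through the lt/cmp_versions helpers from scripts/common.sh to decide whether the installed lcov (1.15 here) predates version 2. A condensed sketch of that component-wise comparison, using illustrative names rather than the script's own: split both versions on dots, compare field by field, and treat missing fields as zero.

    version_lt() {                       # returns 0 when $1 < $2
        local -a a b
        IFS=. read -ra a <<< "$1"
        IFS=. read -ra b <<< "$2"
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < n; i++ )); do
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
        done
        return 1                         # equal versions are not "less than"
    }
    version_lt 1.15 2 && echo "lcov 1.15 predates 2.x"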
00:06:21.023 08:18:34 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:21.023 ************************************ 00:06:21.023 END TEST rpc_client 00:06:21.023 ************************************ 00:06:21.284 08:18:34 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:21.284 08:18:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:21.284 08:18:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:21.284 08:18:34 -- common/autotest_common.sh@10 -- # set +x 00:06:21.284 ************************************ 00:06:21.284 START TEST json_config 00:06:21.284 ************************************ 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1681 -- # lcov --version 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:21.284 08:18:34 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:21.284 08:18:34 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:21.284 08:18:34 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:21.284 08:18:34 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:21.284 08:18:34 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:21.284 08:18:34 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:21.284 08:18:34 json_config -- scripts/common.sh@345 -- # : 1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:21.284 08:18:34 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:21.284 08:18:34 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@353 -- # local d=1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:21.284 08:18:34 json_config -- scripts/common.sh@355 -- # echo 1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:21.284 08:18:34 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@353 -- # local d=2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:21.284 08:18:34 json_config -- scripts/common.sh@355 -- # echo 2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:21.284 08:18:34 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:21.284 08:18:34 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:21.284 08:18:34 json_config -- scripts/common.sh@368 -- # return 0 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:21.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.284 --rc genhtml_branch_coverage=1 00:06:21.284 --rc genhtml_function_coverage=1 00:06:21.284 --rc genhtml_legend=1 00:06:21.284 --rc geninfo_all_blocks=1 00:06:21.284 --rc geninfo_unexecuted_blocks=1 00:06:21.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.284 ' 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:21.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.284 --rc genhtml_branch_coverage=1 00:06:21.284 --rc genhtml_function_coverage=1 00:06:21.284 --rc genhtml_legend=1 00:06:21.284 --rc geninfo_all_blocks=1 00:06:21.284 --rc geninfo_unexecuted_blocks=1 00:06:21.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.284 ' 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:21.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.284 --rc genhtml_branch_coverage=1 00:06:21.284 --rc genhtml_function_coverage=1 00:06:21.284 --rc genhtml_legend=1 00:06:21.284 --rc geninfo_all_blocks=1 00:06:21.284 --rc geninfo_unexecuted_blocks=1 00:06:21.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.284 ' 00:06:21.284 08:18:34 json_config -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:21.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.284 --rc genhtml_branch_coverage=1 00:06:21.284 --rc genhtml_function_coverage=1 00:06:21.284 --rc genhtml_legend=1 00:06:21.284 --rc geninfo_all_blocks=1 00:06:21.284 --rc geninfo_unexecuted_blocks=1 00:06:21.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.284 ' 00:06:21.284 08:18:34 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:21.284 08:18:34 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:21.284 08:18:34 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:21.284 08:18:34 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:21.285 08:18:34 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:21.285 08:18:34 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:21.285 08:18:34 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:21.285 08:18:34 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:21.285 08:18:34 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.285 08:18:34 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.285 08:18:34 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.285 08:18:34 json_config -- paths/export.sh@5 -- # export PATH 00:06:21.285 08:18:34 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@51 -- # : 0 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:21.285 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:21.285 08:18:34 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:21.285 WARNING: No tests are enabled so not running JSON configuration tests 00:06:21.285 08:18:34 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:21.285 00:06:21.285 real 0m0.198s 00:06:21.285 user 0m0.125s 00:06:21.285 sys 0m0.081s 00:06:21.285 08:18:34 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:21.285 08:18:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:21.285 ************************************ 00:06:21.285 END TEST json_config 00:06:21.285 ************************************ 00:06:21.546 08:18:34 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:21.546 08:18:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:21.546 08:18:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:21.546 08:18:34 -- common/autotest_common.sh@10 -- # set +x 00:06:21.546 ************************************ 00:06:21.546 START TEST json_config_extra_key 00:06:21.546 ************************************ 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1681 -- # awk '{print 
$NF}' 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1681 -- # lcov --version 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:21.546 08:18:34 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:21.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.546 --rc genhtml_branch_coverage=1 00:06:21.546 --rc genhtml_function_coverage=1 00:06:21.546 --rc genhtml_legend=1 00:06:21.546 --rc geninfo_all_blocks=1 00:06:21.546 --rc geninfo_unexecuted_blocks=1 00:06:21.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.546 ' 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:21.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.546 --rc genhtml_branch_coverage=1 00:06:21.546 
--rc genhtml_function_coverage=1 00:06:21.546 --rc genhtml_legend=1 00:06:21.546 --rc geninfo_all_blocks=1 00:06:21.546 --rc geninfo_unexecuted_blocks=1 00:06:21.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.546 ' 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:21.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.546 --rc genhtml_branch_coverage=1 00:06:21.546 --rc genhtml_function_coverage=1 00:06:21.546 --rc genhtml_legend=1 00:06:21.546 --rc geninfo_all_blocks=1 00:06:21.546 --rc geninfo_unexecuted_blocks=1 00:06:21.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.546 ' 00:06:21.546 08:18:34 json_config_extra_key -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:21.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.546 --rc genhtml_branch_coverage=1 00:06:21.546 --rc genhtml_function_coverage=1 00:06:21.546 --rc genhtml_legend=1 00:06:21.546 --rc geninfo_all_blocks=1 00:06:21.546 --rc geninfo_unexecuted_blocks=1 00:06:21.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.546 ' 00:06:21.546 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:21.546 08:18:34 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:21.546 08:18:34 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:21.546 08:18:34 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:21.546 08:18:34 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:21.547 08:18:34 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:21.547 08:18:34 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:21.547 08:18:34 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:21.547 08:18:34 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:21.547 08:18:34 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.547 08:18:34 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.547 08:18:34 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.547 08:18:34 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:21.547 08:18:34 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:21.547 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:21.547 08:18:34 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:21.807 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:21.807 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:21.807 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:21.808 INFO: launching applications... 00:06:21.808 08:18:34 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=979307 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:21.808 Waiting for target to run... 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 979307 /var/tmp/spdk_tgt.sock 00:06:21.808 08:18:34 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 979307 ']' 00:06:21.808 08:18:34 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:21.808 08:18:34 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:21.808 08:18:34 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:21.808 08:18:34 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:21.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
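The json_config_extra_key run above launches spdk_tgt against extra_key.json on its own RPC socket and then waits for it to start listening. A minimal sketch of that launch-and-wait pattern; the retry budget and the use of spdk_get_version as the readiness probe are illustrative choices, not the test's exact mechanism.

    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    SOCK=/var/tmp/spdk_tgt.sock
    "$SPDK_DIR/build/bin/spdk_tgt" -m 0x1 -s 1024 -r "$SOCK" \
        --json "$SPDK_DIR/test/json_config/extra_key.json" &
    tgt_pid=$!
    for (( i = 0; i < 100; i++ )); do                     # poll until the target answers RPC
        "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" spdk_get_version >/dev/null 2>&1 && break
        sleep 0.1
    done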
00:06:21.808 08:18:34 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:21.808 08:18:34 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:21.808 [2024-11-17 08:18:34.713827] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:21.808 [2024-11-17 08:18:34.713888] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid979307 ] 00:06:22.068 [2024-11-17 08:18:34.996516] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.068 [2024-11-17 08:18:35.017869] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.640 08:18:35 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:22.640 08:18:35 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:06:22.640 08:18:35 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:22.640 00:06:22.640 08:18:35 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:22.641 INFO: shutting down applications... 00:06:22.641 08:18:35 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 979307 ]] 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 979307 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 979307 00:06:22.641 08:18:35 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 979307 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:23.212 08:18:36 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:23.212 SPDK target shutdown done 00:06:23.212 08:18:36 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:23.212 Success 00:06:23.212 00:06:23.212 real 0m1.594s 00:06:23.212 user 0m1.349s 00:06:23.212 sys 0m0.424s 00:06:23.212 08:18:36 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:23.212 08:18:36 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:23.212 ************************************ 00:06:23.212 END TEST json_config_extra_key 00:06:23.212 ************************************ 00:06:23.212 08:18:36 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
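The shutdown traced at the end of the json_config_extra_key run above is a bounded graceful stop: send SIGINT, then probe the pid with kill -0 for up to thirty half-second intervals before reporting 'SPDK target shutdown done'. A condensed sketch of the same loop; the SIGKILL escalation at the end is an illustrative fallback, not something the trace shows.

    stop_target() {                                      # $1 = pid of the running spdk_tgt
        local pid=$1 i
        kill -SIGINT "$pid" 2>/dev/null
        for (( i = 0; i < 30; i++ )); do
            kill -0 "$pid" 2>/dev/null || { echo "SPDK target shutdown done"; return 0; }
            sleep 0.5
        done
        kill -SIGKILL "$pid" 2>/dev/null                 # last resort if SIGINT never lands
        return 1
    }
    stop_target "$tgt_pid"                               # pid from the launch sketch above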
00:06:23.212 08:18:36 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:23.212 08:18:36 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:23.212 08:18:36 -- common/autotest_common.sh@10 -- # set +x 00:06:23.212 ************************************ 00:06:23.212 START TEST alias_rpc 00:06:23.212 ************************************ 00:06:23.212 08:18:36 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:23.212 * Looking for test storage... 00:06:23.212 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:23.212 08:18:36 alias_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:23.212 08:18:36 alias_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:23.212 08:18:36 alias_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:23.212 08:18:36 alias_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:23.212 08:18:36 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:23.472 08:18:36 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:23.472 08:18:36 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:23.472 08:18:36 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:23.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.472 --rc genhtml_branch_coverage=1 00:06:23.472 --rc genhtml_function_coverage=1 00:06:23.472 --rc genhtml_legend=1 00:06:23.472 --rc geninfo_all_blocks=1 00:06:23.472 --rc geninfo_unexecuted_blocks=1 00:06:23.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.472 ' 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:23.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.472 --rc genhtml_branch_coverage=1 00:06:23.472 --rc genhtml_function_coverage=1 00:06:23.472 --rc genhtml_legend=1 00:06:23.472 --rc geninfo_all_blocks=1 00:06:23.472 --rc geninfo_unexecuted_blocks=1 00:06:23.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.472 ' 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:23.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.472 --rc genhtml_branch_coverage=1 00:06:23.472 --rc genhtml_function_coverage=1 00:06:23.472 --rc genhtml_legend=1 00:06:23.472 --rc geninfo_all_blocks=1 00:06:23.472 --rc geninfo_unexecuted_blocks=1 00:06:23.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.472 ' 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:23.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.472 --rc genhtml_branch_coverage=1 00:06:23.472 --rc genhtml_function_coverage=1 00:06:23.472 --rc genhtml_legend=1 00:06:23.472 --rc geninfo_all_blocks=1 00:06:23.472 --rc geninfo_unexecuted_blocks=1 00:06:23.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.472 ' 00:06:23.472 08:18:36 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:23.472 08:18:36 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=979626 00:06:23.472 08:18:36 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:23.472 08:18:36 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 979626 00:06:23.472 08:18:36 alias_rpc -- 
common/autotest_common.sh@831 -- # '[' -z 979626 ']' 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.472 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:23.472 08:18:36 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:23.472 [2024-11-17 08:18:36.377148] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:23.472 [2024-11-17 08:18:36.377209] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid979626 ] 00:06:23.472 [2024-11-17 08:18:36.442579] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.472 [2024-11-17 08:18:36.480202] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.733 08:18:36 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:23.733 08:18:36 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:06:23.733 08:18:36 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:23.993 08:18:36 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 979626 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 979626 ']' 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 979626 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 979626 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 979626' 00:06:23.993 killing process with pid 979626 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@969 -- # kill 979626 00:06:23.993 08:18:36 alias_rpc -- common/autotest_common.sh@974 -- # wait 979626 00:06:24.254 00:06:24.254 real 0m1.104s 00:06:24.254 user 0m1.061s 00:06:24.254 sys 0m0.471s 00:06:24.254 08:18:37 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:24.254 08:18:37 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:24.254 ************************************ 00:06:24.254 END TEST alias_rpc 00:06:24.254 ************************************ 00:06:24.254 08:18:37 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:24.254 08:18:37 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:24.254 08:18:37 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:24.254 08:18:37 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:24.254 08:18:37 -- common/autotest_common.sh@10 -- # set +x 00:06:24.254 ************************************ 00:06:24.254 START TEST spdkcli_tcp 
00:06:24.254 ************************************ 00:06:24.254 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:24.514 * Looking for test storage... 00:06:24.514 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:24.514 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:24.514 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lcov --version 00:06:24.514 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:24.514 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:24.514 08:18:37 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:24.515 08:18:37 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:24.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:24.515 --rc genhtml_branch_coverage=1 00:06:24.515 --rc genhtml_function_coverage=1 00:06:24.515 --rc genhtml_legend=1 00:06:24.515 --rc geninfo_all_blocks=1 00:06:24.515 --rc geninfo_unexecuted_blocks=1 00:06:24.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:24.515 ' 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:24.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:24.515 --rc genhtml_branch_coverage=1 00:06:24.515 --rc genhtml_function_coverage=1 00:06:24.515 --rc genhtml_legend=1 00:06:24.515 --rc geninfo_all_blocks=1 00:06:24.515 --rc geninfo_unexecuted_blocks=1 00:06:24.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:24.515 ' 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:24.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:24.515 --rc genhtml_branch_coverage=1 00:06:24.515 --rc genhtml_function_coverage=1 00:06:24.515 --rc genhtml_legend=1 00:06:24.515 --rc geninfo_all_blocks=1 00:06:24.515 --rc geninfo_unexecuted_blocks=1 00:06:24.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:24.515 ' 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:24.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:24.515 --rc genhtml_branch_coverage=1 00:06:24.515 --rc genhtml_function_coverage=1 00:06:24.515 --rc genhtml_legend=1 00:06:24.515 --rc geninfo_all_blocks=1 00:06:24.515 --rc geninfo_unexecuted_blocks=1 00:06:24.515 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:24.515 ' 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=979947 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 979947 00:06:24.515 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 979947 ']' 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:24.515 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:24.515 08:18:37 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:24.515 [2024-11-17 08:18:37.546217] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:24.515 [2024-11-17 08:18:37.546272] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid979947 ] 00:06:24.515 [2024-11-17 08:18:37.611885] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:24.775 [2024-11-17 08:18:37.652843] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:24.775 [2024-11-17 08:18:37.652846] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.775 08:18:37 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:24.775 08:18:37 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:06:24.775 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=979966 00:06:24.775 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:24.775 08:18:37 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:25.036 [ 00:06:25.036 "spdk_get_version", 00:06:25.036 "rpc_get_methods", 00:06:25.036 "notify_get_notifications", 00:06:25.036 "notify_get_types", 00:06:25.036 "trace_get_info", 00:06:25.036 "trace_get_tpoint_group_mask", 00:06:25.036 "trace_disable_tpoint_group", 00:06:25.036 "trace_enable_tpoint_group", 00:06:25.036 "trace_clear_tpoint_mask", 00:06:25.036 "trace_set_tpoint_mask", 00:06:25.036 "fsdev_set_opts", 00:06:25.036 "fsdev_get_opts", 00:06:25.036 "framework_get_pci_devices", 00:06:25.036 "framework_get_config", 00:06:25.036 "framework_get_subsystems", 00:06:25.036 "vfu_tgt_set_base_path", 00:06:25.036 
"keyring_get_keys", 00:06:25.036 "iobuf_get_stats", 00:06:25.036 "iobuf_set_options", 00:06:25.036 "sock_get_default_impl", 00:06:25.036 "sock_set_default_impl", 00:06:25.036 "sock_impl_set_options", 00:06:25.036 "sock_impl_get_options", 00:06:25.036 "vmd_rescan", 00:06:25.036 "vmd_remove_device", 00:06:25.036 "vmd_enable", 00:06:25.036 "accel_get_stats", 00:06:25.036 "accel_set_options", 00:06:25.036 "accel_set_driver", 00:06:25.036 "accel_crypto_key_destroy", 00:06:25.036 "accel_crypto_keys_get", 00:06:25.036 "accel_crypto_key_create", 00:06:25.036 "accel_assign_opc", 00:06:25.036 "accel_get_module_info", 00:06:25.036 "accel_get_opc_assignments", 00:06:25.036 "bdev_get_histogram", 00:06:25.036 "bdev_enable_histogram", 00:06:25.036 "bdev_set_qos_limit", 00:06:25.036 "bdev_set_qd_sampling_period", 00:06:25.036 "bdev_get_bdevs", 00:06:25.036 "bdev_reset_iostat", 00:06:25.036 "bdev_get_iostat", 00:06:25.036 "bdev_examine", 00:06:25.036 "bdev_wait_for_examine", 00:06:25.036 "bdev_set_options", 00:06:25.036 "scsi_get_devices", 00:06:25.036 "thread_set_cpumask", 00:06:25.036 "scheduler_set_options", 00:06:25.036 "framework_get_governor", 00:06:25.036 "framework_get_scheduler", 00:06:25.036 "framework_set_scheduler", 00:06:25.036 "framework_get_reactors", 00:06:25.036 "thread_get_io_channels", 00:06:25.036 "thread_get_pollers", 00:06:25.036 "thread_get_stats", 00:06:25.036 "framework_monitor_context_switch", 00:06:25.036 "spdk_kill_instance", 00:06:25.036 "log_enable_timestamps", 00:06:25.036 "log_get_flags", 00:06:25.036 "log_clear_flag", 00:06:25.036 "log_set_flag", 00:06:25.036 "log_get_level", 00:06:25.036 "log_set_level", 00:06:25.036 "log_get_print_level", 00:06:25.036 "log_set_print_level", 00:06:25.036 "framework_enable_cpumask_locks", 00:06:25.036 "framework_disable_cpumask_locks", 00:06:25.036 "framework_wait_init", 00:06:25.036 "framework_start_init", 00:06:25.036 "virtio_blk_create_transport", 00:06:25.036 "virtio_blk_get_transports", 00:06:25.036 "vhost_controller_set_coalescing", 00:06:25.036 "vhost_get_controllers", 00:06:25.036 "vhost_delete_controller", 00:06:25.036 "vhost_create_blk_controller", 00:06:25.036 "vhost_scsi_controller_remove_target", 00:06:25.036 "vhost_scsi_controller_add_target", 00:06:25.036 "vhost_start_scsi_controller", 00:06:25.036 "vhost_create_scsi_controller", 00:06:25.036 "ublk_recover_disk", 00:06:25.036 "ublk_get_disks", 00:06:25.036 "ublk_stop_disk", 00:06:25.036 "ublk_start_disk", 00:06:25.036 "ublk_destroy_target", 00:06:25.036 "ublk_create_target", 00:06:25.036 "nbd_get_disks", 00:06:25.036 "nbd_stop_disk", 00:06:25.036 "nbd_start_disk", 00:06:25.036 "env_dpdk_get_mem_stats", 00:06:25.036 "nvmf_stop_mdns_prr", 00:06:25.036 "nvmf_publish_mdns_prr", 00:06:25.036 "nvmf_subsystem_get_listeners", 00:06:25.036 "nvmf_subsystem_get_qpairs", 00:06:25.036 "nvmf_subsystem_get_controllers", 00:06:25.036 "nvmf_get_stats", 00:06:25.036 "nvmf_get_transports", 00:06:25.036 "nvmf_create_transport", 00:06:25.036 "nvmf_get_targets", 00:06:25.036 "nvmf_delete_target", 00:06:25.036 "nvmf_create_target", 00:06:25.036 "nvmf_subsystem_allow_any_host", 00:06:25.036 "nvmf_subsystem_set_keys", 00:06:25.036 "nvmf_subsystem_remove_host", 00:06:25.036 "nvmf_subsystem_add_host", 00:06:25.036 "nvmf_ns_remove_host", 00:06:25.036 "nvmf_ns_add_host", 00:06:25.036 "nvmf_subsystem_remove_ns", 00:06:25.036 "nvmf_subsystem_set_ns_ana_group", 00:06:25.036 "nvmf_subsystem_add_ns", 00:06:25.036 "nvmf_subsystem_listener_set_ana_state", 00:06:25.036 "nvmf_discovery_get_referrals", 
00:06:25.036 "nvmf_discovery_remove_referral", 00:06:25.036 "nvmf_discovery_add_referral", 00:06:25.036 "nvmf_subsystem_remove_listener", 00:06:25.036 "nvmf_subsystem_add_listener", 00:06:25.036 "nvmf_delete_subsystem", 00:06:25.036 "nvmf_create_subsystem", 00:06:25.036 "nvmf_get_subsystems", 00:06:25.036 "nvmf_set_crdt", 00:06:25.036 "nvmf_set_config", 00:06:25.036 "nvmf_set_max_subsystems", 00:06:25.036 "iscsi_get_histogram", 00:06:25.037 "iscsi_enable_histogram", 00:06:25.037 "iscsi_set_options", 00:06:25.037 "iscsi_get_auth_groups", 00:06:25.037 "iscsi_auth_group_remove_secret", 00:06:25.037 "iscsi_auth_group_add_secret", 00:06:25.037 "iscsi_delete_auth_group", 00:06:25.037 "iscsi_create_auth_group", 00:06:25.037 "iscsi_set_discovery_auth", 00:06:25.037 "iscsi_get_options", 00:06:25.037 "iscsi_target_node_request_logout", 00:06:25.037 "iscsi_target_node_set_redirect", 00:06:25.037 "iscsi_target_node_set_auth", 00:06:25.037 "iscsi_target_node_add_lun", 00:06:25.037 "iscsi_get_stats", 00:06:25.037 "iscsi_get_connections", 00:06:25.037 "iscsi_portal_group_set_auth", 00:06:25.037 "iscsi_start_portal_group", 00:06:25.037 "iscsi_delete_portal_group", 00:06:25.037 "iscsi_create_portal_group", 00:06:25.037 "iscsi_get_portal_groups", 00:06:25.037 "iscsi_delete_target_node", 00:06:25.037 "iscsi_target_node_remove_pg_ig_maps", 00:06:25.037 "iscsi_target_node_add_pg_ig_maps", 00:06:25.037 "iscsi_create_target_node", 00:06:25.037 "iscsi_get_target_nodes", 00:06:25.037 "iscsi_delete_initiator_group", 00:06:25.037 "iscsi_initiator_group_remove_initiators", 00:06:25.037 "iscsi_initiator_group_add_initiators", 00:06:25.037 "iscsi_create_initiator_group", 00:06:25.037 "iscsi_get_initiator_groups", 00:06:25.037 "fsdev_aio_delete", 00:06:25.037 "fsdev_aio_create", 00:06:25.037 "keyring_linux_set_options", 00:06:25.037 "keyring_file_remove_key", 00:06:25.037 "keyring_file_add_key", 00:06:25.037 "vfu_virtio_create_fs_endpoint", 00:06:25.037 "vfu_virtio_create_scsi_endpoint", 00:06:25.037 "vfu_virtio_scsi_remove_target", 00:06:25.037 "vfu_virtio_scsi_add_target", 00:06:25.037 "vfu_virtio_create_blk_endpoint", 00:06:25.037 "vfu_virtio_delete_endpoint", 00:06:25.037 "iaa_scan_accel_module", 00:06:25.037 "dsa_scan_accel_module", 00:06:25.037 "ioat_scan_accel_module", 00:06:25.037 "accel_error_inject_error", 00:06:25.037 "bdev_iscsi_delete", 00:06:25.037 "bdev_iscsi_create", 00:06:25.037 "bdev_iscsi_set_options", 00:06:25.037 "bdev_virtio_attach_controller", 00:06:25.037 "bdev_virtio_scsi_get_devices", 00:06:25.037 "bdev_virtio_detach_controller", 00:06:25.037 "bdev_virtio_blk_set_hotplug", 00:06:25.037 "bdev_ftl_set_property", 00:06:25.037 "bdev_ftl_get_properties", 00:06:25.037 "bdev_ftl_get_stats", 00:06:25.037 "bdev_ftl_unmap", 00:06:25.037 "bdev_ftl_unload", 00:06:25.037 "bdev_ftl_delete", 00:06:25.037 "bdev_ftl_load", 00:06:25.037 "bdev_ftl_create", 00:06:25.037 "bdev_aio_delete", 00:06:25.037 "bdev_aio_rescan", 00:06:25.037 "bdev_aio_create", 00:06:25.037 "blobfs_create", 00:06:25.037 "blobfs_detect", 00:06:25.037 "blobfs_set_cache_size", 00:06:25.037 "bdev_zone_block_delete", 00:06:25.037 "bdev_zone_block_create", 00:06:25.037 "bdev_delay_delete", 00:06:25.037 "bdev_delay_create", 00:06:25.037 "bdev_delay_update_latency", 00:06:25.037 "bdev_split_delete", 00:06:25.037 "bdev_split_create", 00:06:25.037 "bdev_error_inject_error", 00:06:25.037 "bdev_error_delete", 00:06:25.037 "bdev_error_create", 00:06:25.037 "bdev_raid_set_options", 00:06:25.037 "bdev_raid_remove_base_bdev", 00:06:25.037 
"bdev_raid_add_base_bdev", 00:06:25.037 "bdev_raid_delete", 00:06:25.037 "bdev_raid_create", 00:06:25.037 "bdev_raid_get_bdevs", 00:06:25.037 "bdev_lvol_set_parent_bdev", 00:06:25.037 "bdev_lvol_set_parent", 00:06:25.037 "bdev_lvol_check_shallow_copy", 00:06:25.037 "bdev_lvol_start_shallow_copy", 00:06:25.037 "bdev_lvol_grow_lvstore", 00:06:25.037 "bdev_lvol_get_lvols", 00:06:25.037 "bdev_lvol_get_lvstores", 00:06:25.037 "bdev_lvol_delete", 00:06:25.037 "bdev_lvol_set_read_only", 00:06:25.037 "bdev_lvol_resize", 00:06:25.037 "bdev_lvol_decouple_parent", 00:06:25.037 "bdev_lvol_inflate", 00:06:25.037 "bdev_lvol_rename", 00:06:25.037 "bdev_lvol_clone_bdev", 00:06:25.037 "bdev_lvol_clone", 00:06:25.037 "bdev_lvol_snapshot", 00:06:25.037 "bdev_lvol_create", 00:06:25.037 "bdev_lvol_delete_lvstore", 00:06:25.037 "bdev_lvol_rename_lvstore", 00:06:25.037 "bdev_lvol_create_lvstore", 00:06:25.037 "bdev_passthru_delete", 00:06:25.037 "bdev_passthru_create", 00:06:25.037 "bdev_nvme_cuse_unregister", 00:06:25.037 "bdev_nvme_cuse_register", 00:06:25.037 "bdev_opal_new_user", 00:06:25.037 "bdev_opal_set_lock_state", 00:06:25.037 "bdev_opal_delete", 00:06:25.037 "bdev_opal_get_info", 00:06:25.037 "bdev_opal_create", 00:06:25.037 "bdev_nvme_opal_revert", 00:06:25.037 "bdev_nvme_opal_init", 00:06:25.037 "bdev_nvme_send_cmd", 00:06:25.037 "bdev_nvme_set_keys", 00:06:25.037 "bdev_nvme_get_path_iostat", 00:06:25.037 "bdev_nvme_get_mdns_discovery_info", 00:06:25.037 "bdev_nvme_stop_mdns_discovery", 00:06:25.037 "bdev_nvme_start_mdns_discovery", 00:06:25.037 "bdev_nvme_set_multipath_policy", 00:06:25.037 "bdev_nvme_set_preferred_path", 00:06:25.037 "bdev_nvme_get_io_paths", 00:06:25.037 "bdev_nvme_remove_error_injection", 00:06:25.037 "bdev_nvme_add_error_injection", 00:06:25.037 "bdev_nvme_get_discovery_info", 00:06:25.037 "bdev_nvme_stop_discovery", 00:06:25.037 "bdev_nvme_start_discovery", 00:06:25.037 "bdev_nvme_get_controller_health_info", 00:06:25.037 "bdev_nvme_disable_controller", 00:06:25.037 "bdev_nvme_enable_controller", 00:06:25.037 "bdev_nvme_reset_controller", 00:06:25.037 "bdev_nvme_get_transport_statistics", 00:06:25.037 "bdev_nvme_apply_firmware", 00:06:25.037 "bdev_nvme_detach_controller", 00:06:25.037 "bdev_nvme_get_controllers", 00:06:25.037 "bdev_nvme_attach_controller", 00:06:25.037 "bdev_nvme_set_hotplug", 00:06:25.037 "bdev_nvme_set_options", 00:06:25.037 "bdev_null_resize", 00:06:25.037 "bdev_null_delete", 00:06:25.037 "bdev_null_create", 00:06:25.037 "bdev_malloc_delete", 00:06:25.037 "bdev_malloc_create" 00:06:25.037 ] 00:06:25.037 08:18:38 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:25.037 08:18:38 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:25.037 08:18:38 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 979947 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 979947 ']' 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 979947 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 979947 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:25.037 
08:18:38 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 979947' 00:06:25.037 killing process with pid 979947 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 979947 00:06:25.037 08:18:38 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 979947 00:06:25.608 00:06:25.608 real 0m1.117s 00:06:25.608 user 0m1.852s 00:06:25.608 sys 0m0.490s 00:06:25.608 08:18:38 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.608 08:18:38 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:25.608 ************************************ 00:06:25.608 END TEST spdkcli_tcp 00:06:25.608 ************************************ 00:06:25.608 08:18:38 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:25.608 08:18:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:25.608 08:18:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:25.608 08:18:38 -- common/autotest_common.sh@10 -- # set +x 00:06:25.608 ************************************ 00:06:25.608 START TEST dpdk_mem_utility 00:06:25.608 ************************************ 00:06:25.608 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:25.608 * Looking for test storage... 00:06:25.608 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:25.608 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:25.608 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lcov --version 00:06:25.608 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:25.608 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:25.608 08:18:38 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:25.609 08:18:38 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:25.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.609 --rc genhtml_branch_coverage=1 00:06:25.609 --rc genhtml_function_coverage=1 00:06:25.609 --rc genhtml_legend=1 00:06:25.609 --rc geninfo_all_blocks=1 00:06:25.609 --rc geninfo_unexecuted_blocks=1 00:06:25.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.609 ' 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:25.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.609 --rc genhtml_branch_coverage=1 00:06:25.609 --rc genhtml_function_coverage=1 00:06:25.609 --rc genhtml_legend=1 00:06:25.609 --rc geninfo_all_blocks=1 00:06:25.609 --rc geninfo_unexecuted_blocks=1 00:06:25.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.609 ' 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:25.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.609 --rc genhtml_branch_coverage=1 00:06:25.609 --rc genhtml_function_coverage=1 00:06:25.609 --rc genhtml_legend=1 00:06:25.609 --rc geninfo_all_blocks=1 00:06:25.609 --rc geninfo_unexecuted_blocks=1 00:06:25.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.609 ' 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:25.609 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.609 --rc genhtml_branch_coverage=1 00:06:25.609 --rc genhtml_function_coverage=1 00:06:25.609 --rc genhtml_legend=1 00:06:25.609 --rc geninfo_all_blocks=1 00:06:25.609 --rc geninfo_unexecuted_blocks=1 00:06:25.609 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.609 ' 00:06:25.609 08:18:38 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:25.609 08:18:38 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=980289 00:06:25.609 08:18:38 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 980289 00:06:25.609 08:18:38 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 980289 ']' 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.609 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:25.609 08:18:38 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:25.869 [2024-11-17 08:18:38.758852] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:25.869 [2024-11-17 08:18:38.758918] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid980289 ] 00:06:25.869 [2024-11-17 08:18:38.824760] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.869 [2024-11-17 08:18:38.864147] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.130 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:26.130 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:06:26.130 08:18:39 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:26.130 08:18:39 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:26.130 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.130 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:26.130 { 00:06:26.130 "filename": "/tmp/spdk_mem_dump.txt" 00:06:26.130 } 00:06:26.130 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.130 08:18:39 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:26.130 DPDK memory size 860.000000 MiB in 1 heap(s) 00:06:26.130 1 heaps totaling size 860.000000 MiB 00:06:26.130 size: 860.000000 MiB heap id: 0 00:06:26.130 end heaps---------- 00:06:26.130 9 mempools totaling size 642.649841 MiB 00:06:26.130 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:26.130 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:26.130 size: 92.545471 MiB name: bdev_io_980289 00:06:26.130 size: 51.011292 MiB name: evtpool_980289 00:06:26.130 size: 50.003479 MiB name: msgpool_980289 00:06:26.130 size: 36.509338 MiB name: fsdev_io_980289 00:06:26.130 size: 21.763794 MiB name: PDU_Pool 00:06:26.130 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:26.130 size: 0.026123 MiB name: Session_Pool 00:06:26.130 end mempools------- 00:06:26.130 6 memzones totaling size 4.142822 MiB 00:06:26.130 size: 1.000366 MiB name: RG_ring_0_980289 00:06:26.130 size: 1.000366 MiB name: RG_ring_1_980289 00:06:26.130 size: 1.000366 MiB name: RG_ring_4_980289 
00:06:26.130 size: 1.000366 MiB name: RG_ring_5_980289 00:06:26.130 size: 0.125366 MiB name: RG_ring_2_980289 00:06:26.130 size: 0.015991 MiB name: RG_ring_3_980289 00:06:26.130 end memzones------- 00:06:26.130 08:18:39 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:26.130 heap id: 0 total size: 860.000000 MiB number of busy elements: 44 number of free elements: 16 00:06:26.130 list of free elements. size: 13.984680 MiB 00:06:26.130 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:26.130 element at address: 0x200000800000 with size: 1.996948 MiB 00:06:26.130 element at address: 0x20001bc00000 with size: 0.999878 MiB 00:06:26.130 element at address: 0x20001be00000 with size: 0.999878 MiB 00:06:26.130 element at address: 0x200034a00000 with size: 0.994446 MiB 00:06:26.130 element at address: 0x20000b200000 with size: 0.959839 MiB 00:06:26.130 element at address: 0x200015e00000 with size: 0.954285 MiB 00:06:26.130 element at address: 0x20001c000000 with size: 0.936584 MiB 00:06:26.130 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:26.130 element at address: 0x20001d800000 with size: 0.582886 MiB 00:06:26.130 element at address: 0x200003e00000 with size: 0.495605 MiB 00:06:26.130 element at address: 0x200007000000 with size: 0.490723 MiB 00:06:26.130 element at address: 0x20001c200000 with size: 0.485657 MiB 00:06:26.130 element at address: 0x200013800000 with size: 0.481934 MiB 00:06:26.130 element at address: 0x20002ac00000 with size: 0.410034 MiB 00:06:26.130 element at address: 0x200003a00000 with size: 0.354858 MiB 00:06:26.130 list of standard malloc elements. size: 199.218628 MiB 00:06:26.130 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:26.130 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:26.130 element at address: 0x20001bcfff80 with size: 1.000122 MiB 00:06:26.130 element at address: 0x20001befff80 with size: 1.000122 MiB 00:06:26.130 element at address: 0x20001c0fff80 with size: 1.000122 MiB 00:06:26.130 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:26.130 element at address: 0x20001c0eff00 with size: 0.062622 MiB 00:06:26.130 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:26.130 element at address: 0x20001c0efdc0 with size: 0.000305 MiB 00:06:26.130 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003a5ad80 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003a5af80 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200003e7ee00 with size: 0.000183 
MiB 00:06:26.130 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x20000707da00 with size: 0.000183 MiB 00:06:26.130 element at address: 0x20000707dac0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:26.130 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:26.130 element at address: 0x20001387b600 with size: 0.000183 MiB 00:06:26.130 element at address: 0x20001387b6c0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x2000138fb980 with size: 0.000183 MiB 00:06:26.130 element at address: 0x200015ef44c0 with size: 0.000183 MiB 00:06:26.130 element at address: 0x20001c0efc40 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20001c0efd00 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20001c2bc740 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20001d895380 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20001d895440 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20002ac68f80 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20002ac69040 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20002ac6fc40 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20002ac6fe40 with size: 0.000183 MiB 00:06:26.131 element at address: 0x20002ac6ff00 with size: 0.000183 MiB 00:06:26.131 list of memzone associated elements. size: 646.796692 MiB 00:06:26.131 element at address: 0x20001d895500 with size: 211.416748 MiB 00:06:26.131 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:26.131 element at address: 0x20002ac6ffc0 with size: 157.562561 MiB 00:06:26.131 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:26.131 element at address: 0x200015ff4780 with size: 92.045044 MiB 00:06:26.131 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_980289_0 00:06:26.131 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:26.131 associated memzone info: size: 48.002930 MiB name: MP_evtpool_980289_0 00:06:26.131 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:26.131 associated memzone info: size: 48.002930 MiB name: MP_msgpool_980289_0 00:06:26.131 element at address: 0x2000139fdb80 with size: 36.008911 MiB 00:06:26.131 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_980289_0 00:06:26.131 element at address: 0x20001c3be940 with size: 20.255554 MiB 00:06:26.131 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:26.131 element at address: 0x200034bfeb40 with size: 18.005066 MiB 00:06:26.131 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:26.131 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:26.131 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_980289 00:06:26.131 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:26.131 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_980289 00:06:26.131 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:26.131 associated memzone info: size: 1.007996 MiB name: MP_evtpool_980289 00:06:26.131 element at address: 0x2000138fba40 with size: 1.008118 MiB 00:06:26.131 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:26.131 element at address: 0x20001c2bc800 with size: 1.008118 MiB 00:06:26.131 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:26.131 element at address: 0x20000b2fde40 with size: 1.008118 
MiB 00:06:26.131 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:26.131 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:26.131 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:26.131 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:26.131 associated memzone info: size: 1.000366 MiB name: RG_ring_0_980289 00:06:26.131 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:26.131 associated memzone info: size: 1.000366 MiB name: RG_ring_1_980289 00:06:26.131 element at address: 0x200015ef4580 with size: 1.000488 MiB 00:06:26.131 associated memzone info: size: 1.000366 MiB name: RG_ring_4_980289 00:06:26.131 element at address: 0x200034afe940 with size: 1.000488 MiB 00:06:26.131 associated memzone info: size: 1.000366 MiB name: RG_ring_5_980289 00:06:26.131 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:26.131 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_980289 00:06:26.131 element at address: 0x200003e7eec0 with size: 0.500488 MiB 00:06:26.131 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_980289 00:06:26.131 element at address: 0x20001387b780 with size: 0.500488 MiB 00:06:26.131 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:26.131 element at address: 0x20000707db80 with size: 0.500488 MiB 00:06:26.131 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:26.131 element at address: 0x20001c27c540 with size: 0.250488 MiB 00:06:26.131 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:26.131 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:26.131 associated memzone info: size: 0.125366 MiB name: RG_ring_2_980289 00:06:26.131 element at address: 0x20000b2f5b80 with size: 0.031738 MiB 00:06:26.131 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:26.131 element at address: 0x20002ac69100 with size: 0.023743 MiB 00:06:26.131 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:26.131 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:26.131 associated memzone info: size: 0.015991 MiB name: RG_ring_3_980289 00:06:26.131 element at address: 0x20002ac6f240 with size: 0.002441 MiB 00:06:26.131 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:26.131 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:26.131 associated memzone info: size: 0.000183 MiB name: MP_msgpool_980289 00:06:26.131 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:26.131 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_980289 00:06:26.131 element at address: 0x200003a5ae40 with size: 0.000305 MiB 00:06:26.131 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_980289 00:06:26.131 element at address: 0x20002ac6fd00 with size: 0.000305 MiB 00:06:26.131 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:26.131 08:18:39 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:26.131 08:18:39 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 980289 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 980289 ']' 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 980289 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:06:26.131 08:18:39 dpdk_mem_utility 
-- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 980289 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 980289' 00:06:26.131 killing process with pid 980289 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 980289 00:06:26.131 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 980289 00:06:26.700 00:06:26.700 real 0m1.003s 00:06:26.700 user 0m0.912s 00:06:26.700 sys 0m0.450s 00:06:26.700 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:26.700 08:18:39 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:26.700 ************************************ 00:06:26.700 END TEST dpdk_mem_utility 00:06:26.700 ************************************ 00:06:26.701 08:18:39 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:26.701 08:18:39 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.701 08:18:39 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.701 08:18:39 -- common/autotest_common.sh@10 -- # set +x 00:06:26.701 ************************************ 00:06:26.701 START TEST event 00:06:26.701 ************************************ 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:26.701 * Looking for test storage... 00:06:26.701 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1681 -- # lcov --version 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:26.701 08:18:39 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:26.701 08:18:39 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:26.701 08:18:39 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:26.701 08:18:39 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:26.701 08:18:39 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:26.701 08:18:39 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:26.701 08:18:39 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:26.701 08:18:39 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:26.701 08:18:39 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:26.701 08:18:39 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:26.701 08:18:39 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:26.701 08:18:39 event -- scripts/common.sh@344 -- # case "$op" in 00:06:26.701 08:18:39 event -- scripts/common.sh@345 -- # : 1 00:06:26.701 08:18:39 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:26.701 08:18:39 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:26.701 08:18:39 event -- scripts/common.sh@365 -- # decimal 1 00:06:26.701 08:18:39 event -- scripts/common.sh@353 -- # local d=1 00:06:26.701 08:18:39 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:26.701 08:18:39 event -- scripts/common.sh@355 -- # echo 1 00:06:26.701 08:18:39 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:26.701 08:18:39 event -- scripts/common.sh@366 -- # decimal 2 00:06:26.701 08:18:39 event -- scripts/common.sh@353 -- # local d=2 00:06:26.701 08:18:39 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:26.701 08:18:39 event -- scripts/common.sh@355 -- # echo 2 00:06:26.701 08:18:39 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:26.701 08:18:39 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:26.701 08:18:39 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:26.701 08:18:39 event -- scripts/common.sh@368 -- # return 0 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:26.701 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.701 --rc genhtml_branch_coverage=1 00:06:26.701 --rc genhtml_function_coverage=1 00:06:26.701 --rc genhtml_legend=1 00:06:26.701 --rc geninfo_all_blocks=1 00:06:26.701 --rc geninfo_unexecuted_blocks=1 00:06:26.701 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.701 ' 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:26.701 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.701 --rc genhtml_branch_coverage=1 00:06:26.701 --rc genhtml_function_coverage=1 00:06:26.701 --rc genhtml_legend=1 00:06:26.701 --rc geninfo_all_blocks=1 00:06:26.701 --rc geninfo_unexecuted_blocks=1 00:06:26.701 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.701 ' 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:26.701 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.701 --rc genhtml_branch_coverage=1 00:06:26.701 --rc genhtml_function_coverage=1 00:06:26.701 --rc genhtml_legend=1 00:06:26.701 --rc geninfo_all_blocks=1 00:06:26.701 --rc geninfo_unexecuted_blocks=1 00:06:26.701 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.701 ' 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:26.701 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.701 --rc genhtml_branch_coverage=1 00:06:26.701 --rc genhtml_function_coverage=1 00:06:26.701 --rc genhtml_legend=1 00:06:26.701 --rc geninfo_all_blocks=1 00:06:26.701 --rc geninfo_unexecuted_blocks=1 00:06:26.701 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.701 ' 00:06:26.701 08:18:39 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:26.701 08:18:39 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:26.701 08:18:39 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:06:26.701 08:18:39 event -- common/autotest_common.sh@1107 -- # xtrace_disable 
00:06:26.701 08:18:39 event -- common/autotest_common.sh@10 -- # set +x 00:06:26.961 ************************************ 00:06:26.961 START TEST event_perf 00:06:26.961 ************************************ 00:06:26.961 08:18:39 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:26.961 Running I/O for 1 seconds...[2024-11-17 08:18:39.876668] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:26.961 [2024-11-17 08:18:39.876773] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid980516 ] 00:06:26.961 [2024-11-17 08:18:39.946517] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:26.961 [2024-11-17 08:18:39.987722] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:26.961 [2024-11-17 08:18:39.987781] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:26.961 [2024-11-17 08:18:39.987862] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:26.961 [2024-11-17 08:18:39.987864] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.342 Running I/O for 1 seconds... 00:06:28.342 lcore 0: 196147 00:06:28.342 lcore 1: 196149 00:06:28.342 lcore 2: 196146 00:06:28.342 lcore 3: 196148 00:06:28.342 done. 00:06:28.342 00:06:28.342 real 0m1.190s 00:06:28.342 user 0m4.093s 00:06:28.342 sys 0m0.094s 00:06:28.342 08:18:41 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:28.342 08:18:41 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:28.342 ************************************ 00:06:28.342 END TEST event_perf 00:06:28.342 ************************************ 00:06:28.342 08:18:41 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:28.342 08:18:41 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:28.342 08:18:41 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.342 08:18:41 event -- common/autotest_common.sh@10 -- # set +x 00:06:28.342 ************************************ 00:06:28.342 START TEST event_reactor 00:06:28.342 ************************************ 00:06:28.342 08:18:41 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:28.342 [2024-11-17 08:18:41.148571] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:28.342 [2024-11-17 08:18:41.148653] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid980672 ] 00:06:28.342 [2024-11-17 08:18:41.218858] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.342 [2024-11-17 08:18:41.256538] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.282 test_start 00:06:29.282 oneshot 00:06:29.282 tick 100 00:06:29.282 tick 100 00:06:29.282 tick 250 00:06:29.282 tick 100 00:06:29.282 tick 100 00:06:29.282 tick 100 00:06:29.282 tick 250 00:06:29.282 tick 500 00:06:29.282 tick 100 00:06:29.282 tick 100 00:06:29.282 tick 250 00:06:29.282 tick 100 00:06:29.282 tick 100 00:06:29.282 test_end 00:06:29.282 00:06:29.282 real 0m1.181s 00:06:29.282 user 0m1.085s 00:06:29.282 sys 0m0.092s 00:06:29.282 08:18:42 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:29.282 08:18:42 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:29.282 ************************************ 00:06:29.282 END TEST event_reactor 00:06:29.282 ************************************ 00:06:29.282 08:18:42 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:29.282 08:18:42 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:29.282 08:18:42 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:29.282 08:18:42 event -- common/autotest_common.sh@10 -- # set +x 00:06:29.282 ************************************ 00:06:29.282 START TEST event_reactor_perf 00:06:29.282 ************************************ 00:06:29.282 08:18:42 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:29.282 [2024-11-17 08:18:42.414231] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:29.282 [2024-11-17 08:18:42.414311] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid980941 ] 00:06:29.542 [2024-11-17 08:18:42.484562] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.542 [2024-11-17 08:18:42.522012] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.481 test_start 00:06:30.481 test_end 00:06:30.481 Performance: 910264 events per second 00:06:30.481 00:06:30.481 real 0m1.184s 00:06:30.481 user 0m1.093s 00:06:30.481 sys 0m0.087s 00:06:30.481 08:18:43 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:30.481 08:18:43 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:30.481 ************************************ 00:06:30.481 END TEST event_reactor_perf 00:06:30.481 ************************************ 00:06:30.741 08:18:43 event -- event/event.sh@49 -- # uname -s 00:06:30.741 08:18:43 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:30.741 08:18:43 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:30.741 08:18:43 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:30.741 08:18:43 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:30.741 08:18:43 event -- common/autotest_common.sh@10 -- # set +x 00:06:30.741 ************************************ 00:06:30.741 START TEST event_scheduler 00:06:30.741 ************************************ 00:06:30.741 08:18:43 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:30.741 * Looking for test storage... 
00:06:30.741 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:30.741 08:18:43 event.event_scheduler -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:30.741 08:18:43 event.event_scheduler -- common/autotest_common.sh@1681 -- # lcov --version 00:06:30.741 08:18:43 event.event_scheduler -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:30.741 08:18:43 event.event_scheduler -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:30.741 08:18:43 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:30.741 08:18:43 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:30.741 08:18:43 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:30.741 08:18:43 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:30.742 08:18:43 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:30.742 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.742 --rc genhtml_branch_coverage=1 00:06:30.742 --rc genhtml_function_coverage=1 00:06:30.742 --rc genhtml_legend=1 00:06:30.742 --rc geninfo_all_blocks=1 00:06:30.742 --rc geninfo_unexecuted_blocks=1 00:06:30.742 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.742 ' 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:30.742 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.742 --rc genhtml_branch_coverage=1 00:06:30.742 --rc genhtml_function_coverage=1 00:06:30.742 --rc genhtml_legend=1 00:06:30.742 --rc geninfo_all_blocks=1 00:06:30.742 --rc geninfo_unexecuted_blocks=1 00:06:30.742 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.742 ' 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:30.742 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.742 --rc genhtml_branch_coverage=1 00:06:30.742 --rc genhtml_function_coverage=1 00:06:30.742 --rc genhtml_legend=1 00:06:30.742 --rc geninfo_all_blocks=1 00:06:30.742 --rc geninfo_unexecuted_blocks=1 00:06:30.742 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.742 ' 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:30.742 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.742 --rc genhtml_branch_coverage=1 00:06:30.742 --rc genhtml_function_coverage=1 00:06:30.742 --rc genhtml_legend=1 00:06:30.742 --rc geninfo_all_blocks=1 00:06:30.742 --rc geninfo_unexecuted_blocks=1 00:06:30.742 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.742 ' 00:06:30.742 08:18:43 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:30.742 08:18:43 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=981262 00:06:30.742 08:18:43 event.event_scheduler -- 
scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:30.742 08:18:43 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:30.742 08:18:43 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 981262 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 981262 ']' 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:30.742 08:18:43 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:31.002 [2024-11-17 08:18:43.895596] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:31.002 [2024-11-17 08:18:43.895683] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid981262 ] 00:06:31.002 [2024-11-17 08:18:43.961495] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:31.002 [2024-11-17 08:18:44.003411] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.002 [2024-11-17 08:18:44.003493] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:31.002 [2024-11-17 08:18:44.003582] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:31.002 [2024-11-17 08:18:44.003584] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:06:31.002 08:18:44 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:31.002 [2024-11-17 08:18:44.068253] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:31.002 [2024-11-17 08:18:44.068275] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:06:31.002 [2024-11-17 08:18:44.068286] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:31.002 [2024-11-17 08:18:44.068294] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:31.002 [2024-11-17 08:18:44.068301] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.002 08:18:44 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 
00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:31.002 [2024-11-17 08:18:44.135455] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.002 08:18:44 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:31.002 08:18:44 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 ************************************ 00:06:31.262 START TEST scheduler_create_thread 00:06:31.262 ************************************ 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 2 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 3 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 4 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 5 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 
08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 6 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 7 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 8 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 9 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.262 10 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.262 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:31.263 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.263 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:31.831 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.831 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:31.831 08:18:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:31.831 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.831 08:18:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:32.824 08:18:45 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:32.824 08:18:45 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:32.824 08:18:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:32.824 08:18:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:33.458 08:18:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:33.458 08:18:46 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:33.458 08:18:46 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:33.458 08:18:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:33.458 08:18:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:34.409 08:18:47 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.409 00:06:34.409 real 0m3.227s 00:06:34.409 user 0m0.024s 00:06:34.409 sys 0m0.008s 00:06:34.409 08:18:47 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:34.409 08:18:47 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:34.409 ************************************ 00:06:34.409 END TEST scheduler_create_thread 00:06:34.409 ************************************ 00:06:34.409 08:18:47 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:34.409 08:18:47 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 981262 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 981262 ']' 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 981262 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 981262 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 981262' 00:06:34.409 killing process with pid 981262 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 981262 00:06:34.409 08:18:47 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 981262 00:06:34.668 [2024-11-17 08:18:47.784517] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
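Editor's note: the scheduler test that just stopped boils down to a short RPC conversation with the app launched as test/event/scheduler/scheduler --wait-for-rpc. The sketch below replays that conversation by hand with scripts/rpc.py; it is a condensed, illustrative reconstruction, not the harness script itself. It assumes the default /var/tmp/spdk.sock socket and that the test's scheduler_plugin module is on rpc.py's plugin search path (the harness arranges that before calling rpc_cmd); the captured thread ids are whatever the app returns, 11 and 12 in the trace above.

# Manual replay of the scheduler test's RPC sequence (illustrative sketch).
RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk.sock

# The app was started with --wait-for-rpc, so the scheduler is chosen before init completes.
$RPC -s $SOCK framework_set_scheduler dynamic
$RPC -s $SOCK framework_start_init

# One busy thread and one idle thread pinned to core 0, as in the trace.
tid=$($RPC -s $SOCK --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100)
$RPC -s $SOCK --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0

# Drop the busy thread to 50% activity, then create and delete a throwaway thread.
$RPC -s $SOCK --plugin scheduler_plugin scheduler_thread_set_active "$tid" 50
tid2=$($RPC -s $SOCK --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
$RPC -s $SOCK --plugin scheduler_plugin scheduler_thread_delete "$tid2"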
00:06:34.927 00:06:34.927 real 0m4.351s 00:06:34.927 user 0m7.475s 00:06:34.927 sys 0m0.431s 00:06:34.927 08:18:48 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:34.927 08:18:48 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:34.927 ************************************ 00:06:34.928 END TEST event_scheduler 00:06:34.928 ************************************ 00:06:35.188 08:18:48 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:35.188 08:18:48 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:35.188 08:18:48 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.188 08:18:48 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.188 08:18:48 event -- common/autotest_common.sh@10 -- # set +x 00:06:35.188 ************************************ 00:06:35.188 START TEST app_repeat 00:06:35.188 ************************************ 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@19 -- # repeat_pid=982118 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 982118' 00:06:35.188 Process app_repeat pid: 982118 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:35.188 spdk_app_start Round 0 00:06:35.188 08:18:48 event.app_repeat -- event/event.sh@25 -- # waitforlisten 982118 /var/tmp/spdk-nbd.sock 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 982118 ']' 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:35.188 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:35.188 08:18:48 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:35.188 [2024-11-17 08:18:48.141472] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:35.188 [2024-11-17 08:18:48.141549] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid982118 ] 00:06:35.188 [2024-11-17 08:18:48.210366] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:35.188 [2024-11-17 08:18:48.250928] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.188 [2024-11-17 08:18:48.250931] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.447 08:18:48 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:35.447 08:18:48 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:35.447 08:18:48 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:35.447 Malloc0 00:06:35.447 08:18:48 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:35.706 Malloc1 00:06:35.706 08:18:48 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.706 08:18:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.707 08:18:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:35.967 /dev/nbd0 00:06:35.967 08:18:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:35.967 08:18:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 
/proc/partitions 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:35.967 1+0 records in 00:06:35.967 1+0 records out 00:06:35.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230123 s, 17.8 MB/s 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:35.967 08:18:48 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.967 08:18:49 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:35.967 08:18:49 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:35.967 08:18:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:35.967 08:18:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.967 08:18:49 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:36.226 /dev/nbd1 00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:36.226 1+0 records in 00:06:36.226 1+0 records out 00:06:36.226 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239507 s, 17.1 MB/s 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:36.226 08:18:49 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
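Editor's note: the nbd bring-up traced above follows a simple readiness check: after nbd_start_disk is issued over /var/tmp/spdk-nbd.sock, the helper polls /proc/partitions for the device name and then confirms a single direct 4 KiB read succeeds before declaring the device usable. A hand-rolled equivalent is sketched below; the retry count and the /tmp scratch path are arbitrary choices of this note, not values taken from the harness.

RPC="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

# Export a 64 MiB malloc bdev (4096-byte blocks) over NBD, mirroring the trace above.
$RPC bdev_malloc_create 64 4096          # prints the new bdev name, Malloc0 in the trace
$RPC nbd_start_disk Malloc0 /dev/nbd0

# Wait until the kernel has registered the device and it answers one direct 4 KiB read.
for i in $(seq 1 20); do
    if grep -q -w nbd0 /proc/partitions &&
       dd if=/dev/nbd0 of=/tmp/nbdprobe bs=4096 count=1 iflag=direct 2>/dev/null; then
        [ "$(stat -c %s /tmp/nbdprobe)" -ne 0 ] && break
    fi
    sleep 0.1
done
rm -f /tmp/nbdprobe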
00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.226 08:18:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:36.485 { 00:06:36.485 "nbd_device": "/dev/nbd0", 00:06:36.485 "bdev_name": "Malloc0" 00:06:36.485 }, 00:06:36.485 { 00:06:36.485 "nbd_device": "/dev/nbd1", 00:06:36.485 "bdev_name": "Malloc1" 00:06:36.485 } 00:06:36.485 ]' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:36.485 { 00:06:36.485 "nbd_device": "/dev/nbd0", 00:06:36.485 "bdev_name": "Malloc0" 00:06:36.485 }, 00:06:36.485 { 00:06:36.485 "nbd_device": "/dev/nbd1", 00:06:36.485 "bdev_name": "Malloc1" 00:06:36.485 } 00:06:36.485 ]' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:36.485 /dev/nbd1' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:36.485 /dev/nbd1' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:36.485 256+0 records in 00:06:36.485 256+0 records out 00:06:36.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102793 s, 102 MB/s 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:36.485 256+0 records in 00:06:36.485 256+0 records out 00:06:36.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.01942 s, 54.0 MB/s 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:36.485 256+0 records in 00:06:36.485 256+0 records out 00:06:36.485 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020847 s, 50.3 MB/s 
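Editor's note: with both devices attached, the data path is exercised by pushing 1 MiB of random data through each /dev/nbdX and reading it back. The two ~50 MB/s dd runs just above are that write phase; the cmp calls that follow in the trace are the read-back. Stripped of the harness plumbing, the round trip is roughly the sketch below, with the pattern file kept in /tmp here purely for illustration (the harness keeps its copy under spdk/test/event).

PATTERN=/tmp/nbdrandtest   # scratch pattern file for this sketch

# Write phase: generate 1 MiB of random data and push it through each NBD device.
dd if=/dev/urandom of=$PATTERN bs=4096 count=256
for dev in /dev/nbd0 /dev/nbd1; do
    dd if=$PATTERN of=$dev bs=4096 count=256 oflag=direct
done

# Verify phase: the first 1 MiB read back from each device must match byte for byte.
for dev in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M $PATTERN $dev
done
rm $PATTERN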
00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:36.485 08:18:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:36.745 08:18:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:37.004 08:18:49 event.app_repeat -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.004 08:18:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:37.263 08:18:50 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:37.263 08:18:50 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:37.522 08:18:50 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:37.522 [2024-11-17 08:18:50.636367] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:37.781 [2024-11-17 08:18:50.671778] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:37.781 [2024-11-17 08:18:50.671780] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.781 [2024-11-17 08:18:50.712209] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:37.781 [2024-11-17 08:18:50.712252] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:41.074 08:18:53 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:41.074 08:18:53 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:41.074 spdk_app_start Round 1 00:06:41.074 08:18:53 event.app_repeat -- event/event.sh@25 -- # waitforlisten 982118 /var/tmp/spdk-nbd.sock 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 982118 ']' 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:41.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:41.074 08:18:53 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:41.074 08:18:53 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:41.074 Malloc0 00:06:41.074 08:18:53 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:41.074 Malloc1 00:06:41.074 08:18:54 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.074 08:18:54 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:41.333 /dev/nbd0 00:06:41.333 08:18:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:41.333 08:18:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:41.333 08:18:54 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:41.334 1+0 records in 00:06:41.334 1+0 records out 00:06:41.334 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000115045 s, 35.6 MB/s 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:41.334 08:18:54 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:41.334 08:18:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:41.334 08:18:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.334 08:18:54 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:41.593 /dev/nbd1 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:41.593 1+0 records in 00:06:41.593 1+0 records out 00:06:41.593 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242076 s, 16.9 MB/s 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:41.593 08:18:54 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.593 08:18:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:41.853 { 00:06:41.853 "nbd_device": "/dev/nbd0", 00:06:41.853 "bdev_name": "Malloc0" 00:06:41.853 }, 00:06:41.853 { 00:06:41.853 "nbd_device": "/dev/nbd1", 00:06:41.853 "bdev_name": "Malloc1" 00:06:41.853 } 00:06:41.853 ]' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:41.853 { 00:06:41.853 "nbd_device": "/dev/nbd0", 00:06:41.853 "bdev_name": "Malloc0" 00:06:41.853 }, 00:06:41.853 { 00:06:41.853 "nbd_device": "/dev/nbd1", 00:06:41.853 "bdev_name": "Malloc1" 00:06:41.853 } 00:06:41.853 ]' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:41.853 /dev/nbd1' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:41.853 /dev/nbd1' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:41.853 256+0 records in 00:06:41.853 256+0 records out 00:06:41.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110373 s, 95.0 MB/s 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:41.853 256+0 records in 00:06:41.853 256+0 records out 00:06:41.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0196513 s, 53.4 MB/s 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:41.853 256+0 records in 00:06:41.853 256+0 records out 00:06:41.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020912 s, 50.1 MB/s 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:41.853 08:18:54 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:42.112 08:18:55 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:42.371 08:18:55 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:42.631 08:18:55 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:42.631 08:18:55 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:42.631 08:18:55 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:42.890 [2024-11-17 08:18:55.902783] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:42.890 [2024-11-17 08:18:55.936946] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:42.890 [2024-11-17 08:18:55.936948] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.890 [2024-11-17 08:18:55.978091] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:42.890 [2024-11-17 08:18:55.978135] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:46.178 08:18:58 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:46.178 08:18:58 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:46.178 spdk_app_start Round 2 00:06:46.178 08:18:58 event.app_repeat -- event/event.sh@25 -- # waitforlisten 982118 /var/tmp/spdk-nbd.sock 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 982118 ']' 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:46.178 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:46.178 08:18:58 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:46.178 08:18:58 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:46.178 Malloc0 00:06:46.178 08:18:59 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:46.178 Malloc1 00:06:46.178 08:18:59 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:46.178 08:18:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:46.436 /dev/nbd0 00:06:46.436 08:18:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:46.436 08:18:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:46.436 1+0 records in 00:06:46.436 1+0 records out 00:06:46.436 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223639 s, 18.3 MB/s 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:46.436 08:18:59 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:46.436 08:18:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:46.436 08:18:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:46.436 08:18:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:46.694 /dev/nbd1 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:46.694 1+0 records in 00:06:46.694 1+0 records out 00:06:46.694 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251623 s, 16.3 MB/s 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:46.694 08:18:59 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:46.694 08:18:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:46.952 { 00:06:46.952 "nbd_device": "/dev/nbd0", 00:06:46.952 "bdev_name": "Malloc0" 00:06:46.952 }, 00:06:46.952 { 00:06:46.952 "nbd_device": "/dev/nbd1", 00:06:46.952 "bdev_name": "Malloc1" 00:06:46.952 } 00:06:46.952 ]' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:46.952 { 00:06:46.952 "nbd_device": "/dev/nbd0", 00:06:46.952 "bdev_name": "Malloc0" 00:06:46.952 }, 00:06:46.952 { 00:06:46.952 "nbd_device": "/dev/nbd1", 00:06:46.952 "bdev_name": "Malloc1" 00:06:46.952 } 00:06:46.952 ]' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:46.952 /dev/nbd1' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:46.952 /dev/nbd1' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:46.952 256+0 records in 00:06:46.952 256+0 records out 00:06:46.952 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115986 s, 90.4 MB/s 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:46.952 256+0 records in 00:06:46.952 256+0 records out 00:06:46.952 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019709 s, 53.2 MB/s 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:46.952 08:19:00 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:47.211 256+0 records in 00:06:47.211 256+0 records out 00:06:47.211 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212791 s, 49.3 MB/s 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:47.211 08:19:00 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:47.471 08:19:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:47.730 08:19:00 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:47.730 08:19:00 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:47.990 08:19:01 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:48.249 [2024-11-17 08:19:01.177936] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:48.249 [2024-11-17 08:19:01.212895] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:48.249 [2024-11-17 08:19:01.212897] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.249 [2024-11-17 08:19:01.252512] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:48.249 [2024-11-17 08:19:01.252554] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:51.537 08:19:04 event.app_repeat -- event/event.sh@38 -- # waitforlisten 982118 /var/tmp/spdk-nbd.sock 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 982118 ']' 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:51.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:51.537 08:19:04 event.app_repeat -- event/event.sh@39 -- # killprocess 982118 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 982118 ']' 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 982118 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 982118 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 982118' 00:06:51.537 killing process with pid 982118 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@969 -- # kill 982118 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@974 -- # wait 982118 00:06:51.537 spdk_app_start is called in Round 0. 00:06:51.537 Shutdown signal received, stop current app iteration 00:06:51.537 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 reinitialization... 00:06:51.537 spdk_app_start is called in Round 1. 00:06:51.537 Shutdown signal received, stop current app iteration 00:06:51.537 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 reinitialization... 00:06:51.537 spdk_app_start is called in Round 2. 00:06:51.537 Shutdown signal received, stop current app iteration 00:06:51.537 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 reinitialization... 00:06:51.537 spdk_app_start is called in Round 3. 
00:06:51.537 Shutdown signal received, stop current app iteration 00:06:51.537 08:19:04 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:51.537 08:19:04 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:51.537 00:06:51.537 real 0m16.291s 00:06:51.537 user 0m34.995s 00:06:51.537 sys 0m3.202s 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.537 08:19:04 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:51.537 ************************************ 00:06:51.537 END TEST app_repeat 00:06:51.537 ************************************ 00:06:51.537 08:19:04 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:51.537 08:19:04 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:51.538 08:19:04 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.538 08:19:04 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.538 08:19:04 event -- common/autotest_common.sh@10 -- # set +x 00:06:51.538 ************************************ 00:06:51.538 START TEST cpu_locks 00:06:51.538 ************************************ 00:06:51.538 08:19:04 event.cpu_locks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:51.538 * Looking for test storage... 00:06:51.538 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:51.538 08:19:04 event.cpu_locks -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:51.538 08:19:04 event.cpu_locks -- common/autotest_common.sh@1681 -- # lcov --version 00:06:51.538 08:19:04 event.cpu_locks -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:51.538 08:19:04 event.cpu_locks -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:06:51.538 08:19:04 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:51.797 08:19:04 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:06:51.797 08:19:04 event.cpu_locks -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:51.797 08:19:04 event.cpu_locks -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:51.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.797 --rc genhtml_branch_coverage=1 00:06:51.797 --rc genhtml_function_coverage=1 00:06:51.797 --rc genhtml_legend=1 00:06:51.797 --rc geninfo_all_blocks=1 00:06:51.797 --rc geninfo_unexecuted_blocks=1 00:06:51.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.797 ' 00:06:51.797 08:19:04 event.cpu_locks -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:51.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.797 --rc genhtml_branch_coverage=1 00:06:51.797 --rc genhtml_function_coverage=1 00:06:51.797 --rc genhtml_legend=1 00:06:51.797 --rc geninfo_all_blocks=1 00:06:51.797 --rc geninfo_unexecuted_blocks=1 00:06:51.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.797 ' 00:06:51.797 08:19:04 event.cpu_locks -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:51.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.797 --rc genhtml_branch_coverage=1 00:06:51.797 --rc genhtml_function_coverage=1 00:06:51.797 --rc genhtml_legend=1 00:06:51.797 --rc geninfo_all_blocks=1 00:06:51.797 --rc geninfo_unexecuted_blocks=1 00:06:51.797 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.797 ' 00:06:51.797 08:19:04 event.cpu_locks -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:51.797 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.797 --rc genhtml_branch_coverage=1 00:06:51.798 --rc genhtml_function_coverage=1 00:06:51.798 --rc genhtml_legend=1 00:06:51.798 --rc geninfo_all_blocks=1 00:06:51.798 --rc geninfo_unexecuted_blocks=1 00:06:51.798 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.798 ' 00:06:51.798 08:19:04 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:51.798 08:19:04 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:51.798 08:19:04 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:51.798 08:19:04 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:51.798 08:19:04 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.798 08:19:04 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.798 08:19:04 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:51.798 ************************************ 00:06:51.798 START TEST default_locks 00:06:51.798 ************************************ 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # default_locks 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=985237 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 985237 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 985237 ']' 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:51.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:51.798 08:19:04 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:06:51.798 [2024-11-17 08:19:04.744740] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:51.798 [2024-11-17 08:19:04.744806] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid985237 ] 00:06:51.798 [2024-11-17 08:19:04.810258] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.798 [2024-11-17 08:19:04.847600] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.057 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:52.057 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 0 00:06:52.057 08:19:05 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 985237 00:06:52.057 08:19:05 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 985237 00:06:52.057 08:19:05 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:52.993 lslocks: write error 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 985237 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@950 -- # '[' -z 985237 ']' 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # kill -0 985237 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # uname 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 985237 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 985237' 00:06:52.993 killing process with pid 985237 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@969 -- # kill 985237 00:06:52.993 08:19:05 event.cpu_locks.default_locks -- common/autotest_common.sh@974 -- # wait 985237 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 985237 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@650 -- # local es=0 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 985237 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # waitforlisten 985237 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 985237 ']' 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 
00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.252 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:06:53.252 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (985237) - No such process 00:06:53.252 ERROR: process (pid: 985237) is no longer running 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:53.252 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 1 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # es=1 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:53.253 00:06:53.253 real 0m1.472s 00:06:53.253 user 0m1.477s 00:06:53.253 sys 0m0.708s 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:53.253 08:19:06 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:06:53.253 ************************************ 00:06:53.253 END TEST default_locks 00:06:53.253 ************************************ 00:06:53.253 08:19:06 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:53.253 08:19:06 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:53.253 08:19:06 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:53.253 08:19:06 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:53.253 ************************************ 00:06:53.253 START TEST default_locks_via_rpc 00:06:53.253 ************************************ 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # default_locks_via_rpc 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=985543 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 985543 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 985543 ']' 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:53.253 08:19:06 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.253 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:53.253 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.253 [2024-11-17 08:19:06.298170] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:53.253 [2024-11-17 08:19:06.298251] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid985543 ] 00:06:53.253 [2024-11-17 08:19:06.366026] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.512 [2024-11-17 08:19:06.405365] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 985543 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 985543 00:06:53.512 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 985543 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@950 -- # '[' -z 985543 ']' 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # kill -0 985543 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # uname 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 
00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 985543 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 985543' 00:06:54.080 killing process with pid 985543 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@969 -- # kill 985543 00:06:54.080 08:19:06 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@974 -- # wait 985543 00:06:54.340 00:06:54.340 real 0m1.035s 00:06:54.340 user 0m0.996s 00:06:54.340 sys 0m0.499s 00:06:54.340 08:19:07 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:54.340 08:19:07 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.340 ************************************ 00:06:54.340 END TEST default_locks_via_rpc 00:06:54.340 ************************************ 00:06:54.340 08:19:07 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:54.340 08:19:07 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:54.340 08:19:07 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:54.340 08:19:07 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:54.340 ************************************ 00:06:54.340 START TEST non_locking_app_on_locked_coremask 00:06:54.340 ************************************ 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # non_locking_app_on_locked_coremask 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=985622 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 985622 /var/tmp/spdk.sock 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 985622 ']' 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:54.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:54.340 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:54.340 [2024-11-17 08:19:07.411890] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:54.340 [2024-11-17 08:19:07.411951] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid985622 ] 00:06:54.599 [2024-11-17 08:19:07.479740] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.599 [2024-11-17 08:19:07.518110] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=985792 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 985792 /var/tmp/spdk2.sock 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 985792 ']' 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:54.599 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:54.599 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:54.600 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:54.600 08:19:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:54.600 [2024-11-17 08:19:07.737293] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:54.600 [2024-11-17 08:19:07.737362] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid985792 ] 00:06:54.859 [2024-11-17 08:19:07.830791] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:54.859 [2024-11-17 08:19:07.830825] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.859 [2024-11-17 08:19:07.904521] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.798 08:19:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:55.798 08:19:08 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:06:55.798 08:19:08 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 985622 00:06:55.798 08:19:08 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 985622 00:06:55.798 08:19:08 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:56.366 lslocks: write error 00:06:56.366 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 985622 00:06:56.366 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 985622 ']' 00:06:56.366 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 985622 00:06:56.366 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:06:56.366 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:56.366 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 985622 00:06:56.625 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:56.625 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:56.625 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 985622' 00:06:56.625 killing process with pid 985622 00:06:56.625 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 985622 00:06:56.625 08:19:09 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 985622 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 985792 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 985792 ']' 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 985792 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 985792 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 985792' 00:06:57.194 killing 
process with pid 985792 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 985792 00:06:57.194 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 985792 00:06:57.454 00:06:57.454 real 0m3.114s 00:06:57.454 user 0m3.233s 00:06:57.454 sys 0m1.154s 00:06:57.454 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:57.454 08:19:10 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:57.454 ************************************ 00:06:57.454 END TEST non_locking_app_on_locked_coremask 00:06:57.454 ************************************ 00:06:57.454 08:19:10 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:57.454 08:19:10 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:57.454 08:19:10 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:57.454 08:19:10 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:57.454 ************************************ 00:06:57.454 START TEST locking_app_on_unlocked_coremask 00:06:57.454 ************************************ 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_unlocked_coremask 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=986244 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 986244 /var/tmp/spdk.sock 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 986244 ']' 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:57.454 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:57.454 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:57.714 [2024-11-17 08:19:10.595206] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:06:57.714 [2024-11-17 08:19:10.595261] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid986244 ] 00:06:57.714 [2024-11-17 08:19:10.661532] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:57.714 [2024-11-17 08:19:10.661558] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.714 [2024-11-17 08:19:10.699025] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=986412 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 986412 /var/tmp/spdk2.sock 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 986412 ']' 00:06:57.973 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:57.974 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:57.974 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:57.974 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:57.974 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:57.974 08:19:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:57.974 [2024-11-17 08:19:10.915288] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:06:57.974 [2024-11-17 08:19:10.915347] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid986412 ] 00:06:57.974 [2024-11-17 08:19:11.004216] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.974 [2024-11-17 08:19:11.081652] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.911 08:19:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:58.911 08:19:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:06:58.911 08:19:11 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 986412 00:06:58.911 08:19:11 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 986412 00:06:58.911 08:19:11 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:59.881 lslocks: write error 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 986244 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 986244 ']' 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 986244 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 986244 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 986244' 00:06:59.881 killing process with pid 986244 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 986244 00:06:59.881 08:19:12 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 986244 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 986412 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 986412 ']' 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 986412 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 986412 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:00.451 08:19:13 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 986412' 00:07:00.451 killing process with pid 986412 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 986412 00:07:00.451 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 986412 00:07:00.710 00:07:00.710 real 0m3.167s 00:07:00.710 user 0m3.328s 00:07:00.710 sys 0m1.172s 00:07:00.710 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:00.710 08:19:13 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:00.710 ************************************ 00:07:00.710 END TEST locking_app_on_unlocked_coremask 00:07:00.710 ************************************ 00:07:00.711 08:19:13 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:00.711 08:19:13 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:00.711 08:19:13 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:00.711 08:19:13 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:00.711 ************************************ 00:07:00.711 START TEST locking_app_on_locked_coremask 00:07:00.711 ************************************ 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_locked_coremask 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=986905 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 986905 /var/tmp/spdk.sock 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 986905 ']' 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.711 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:00.711 08:19:13 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:00.970 [2024-11-17 08:19:13.852461] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:00.970 [2024-11-17 08:19:13.852518] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid986905 ] 00:07:00.970 [2024-11-17 08:19:13.920094] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.970 [2024-11-17 08:19:13.959026] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=987028 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 987028 /var/tmp/spdk2.sock 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 987028 /var/tmp/spdk2.sock 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # waitforlisten 987028 /var/tmp/spdk2.sock 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 987028 ']' 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:01.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:01.230 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:01.230 [2024-11-17 08:19:14.165789] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:01.230 [2024-11-17 08:19:14.165858] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987028 ] 00:07:01.230 [2024-11-17 08:19:14.250320] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 986905 has claimed it. 00:07:01.230 [2024-11-17 08:19:14.250363] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:01.799 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (987028) - No such process 00:07:01.799 ERROR: process (pid: 987028) is no longer running 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 986905 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 986905 00:07:01.799 08:19:14 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:02.367 lslocks: write error 00:07:02.367 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 986905 00:07:02.367 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 986905 ']' 00:07:02.367 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 986905 00:07:02.367 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:02.367 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:02.367 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 986905 00:07:02.627 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:02.627 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:02.627 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 986905' 00:07:02.627 killing process with pid 986905 00:07:02.627 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 986905 00:07:02.627 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 986905 00:07:02.887 00:07:02.887 real 0m2.024s 00:07:02.887 user 0m2.160s 00:07:02.887 sys 0m0.750s 00:07:02.887 08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:02.887 
08:19:15 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:02.887 ************************************ 00:07:02.887 END TEST locking_app_on_locked_coremask 00:07:02.887 ************************************ 00:07:02.887 08:19:15 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:02.887 08:19:15 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:02.887 08:19:15 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.887 08:19:15 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:02.887 ************************************ 00:07:02.887 START TEST locking_overlapped_coremask 00:07:02.887 ************************************ 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=987331 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 987331 /var/tmp/spdk.sock 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 987331 ']' 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:02.887 08:19:15 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:02.887 [2024-11-17 08:19:15.956171] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:02.887 [2024-11-17 08:19:15.956247] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987331 ] 00:07:02.887 [2024-11-17 08:19:16.024321] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:03.147 [2024-11-17 08:19:16.060945] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:03.147 [2024-11-17 08:19:16.061039] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:03.147 [2024-11-17 08:19:16.061042] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=987340 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 987340 /var/tmp/spdk2.sock 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 987340 /var/tmp/spdk2.sock 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # waitforlisten 987340 /var/tmp/spdk2.sock 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 987340 ']' 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:03.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:03.147 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:03.147 [2024-11-17 08:19:16.277595] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:03.147 [2024-11-17 08:19:16.277671] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987340 ] 00:07:03.407 [2024-11-17 08:19:16.372262] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 987331 has claimed it. 00:07:03.407 [2024-11-17 08:19:16.372303] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:03.976 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (987340) - No such process 00:07:03.976 ERROR: process (pid: 987340) is no longer running 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 987331 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@950 -- # '[' -z 987331 ']' 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # kill -0 987331 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # uname 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:03.976 08:19:16 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 987331 00:07:03.976 08:19:17 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:03.976 08:19:17 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:03.976 08:19:17 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 987331' 00:07:03.976 killing process with pid 987331 00:07:03.976 08:19:17 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@969 -- # kill 987331 00:07:03.976 08:19:17 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@974 -- # wait 987331 00:07:04.235 00:07:04.235 real 0m1.391s 00:07:04.235 user 0m3.833s 00:07:04.235 sys 0m0.424s 00:07:04.235 08:19:17 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:04.235 08:19:17 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:04.235 ************************************ 00:07:04.235 END TEST locking_overlapped_coremask 00:07:04.235 ************************************ 00:07:04.235 08:19:17 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:04.235 08:19:17 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:04.235 08:19:17 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:04.235 08:19:17 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:04.495 ************************************ 00:07:04.495 START TEST locking_overlapped_coremask_via_rpc 00:07:04.495 ************************************ 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask_via_rpc 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=987626 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 987626 /var/tmp/spdk.sock 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 987626 ']' 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:04.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:04.495 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:04.495 [2024-11-17 08:19:17.429057] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:04.495 [2024-11-17 08:19:17.429115] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987626 ] 00:07:04.496 [2024-11-17 08:19:17.495770] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:04.496 [2024-11-17 08:19:17.495798] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:04.496 [2024-11-17 08:19:17.535131] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.496 [2024-11-17 08:19:17.535226] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.496 [2024-11-17 08:19:17.535227] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=987639 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 987639 /var/tmp/spdk2.sock 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 987639 ']' 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:04.756 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:04.756 08:19:17 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:04.756 [2024-11-17 08:19:17.758632] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:04.756 [2024-11-17 08:19:17.758719] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987639 ] 00:07:04.756 [2024-11-17 08:19:17.853104] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:04.756 [2024-11-17 08:19:17.853137] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:05.015 [2024-11-17 08:19:17.932825] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:05.015 [2024-11-17 08:19:17.932945] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:05.015 [2024-11-17 08:19:17.932947] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@650 -- # local es=0 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.583 [2024-11-17 08:19:18.636755] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 987626 has claimed it. 
00:07:05.583 request: 00:07:05.583 { 00:07:05.583 "method": "framework_enable_cpumask_locks", 00:07:05.583 "req_id": 1 00:07:05.583 } 00:07:05.583 Got JSON-RPC error response 00:07:05.583 response: 00:07:05.583 { 00:07:05.583 "code": -32603, 00:07:05.583 "message": "Failed to claim CPU core: 2" 00:07:05.583 } 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # es=1 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 987626 /var/tmp/spdk.sock 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 987626 ']' 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:05.583 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 987639 /var/tmp/spdk2.sock 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 987639 ']' 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:05.842 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:05.842 08:19:18 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:06.101 00:07:06.101 real 0m1.659s 00:07:06.101 user 0m0.767s 00:07:06.101 sys 0m0.187s 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:06.101 08:19:19 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:06.101 ************************************ 00:07:06.101 END TEST locking_overlapped_coremask_via_rpc 00:07:06.101 ************************************ 00:07:06.101 08:19:19 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:06.101 08:19:19 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 987626 ]] 00:07:06.101 08:19:19 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 987626 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 987626 ']' 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 987626 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 987626 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 987626' 00:07:06.101 killing process with pid 987626 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 987626 00:07:06.101 08:19:19 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 987626 00:07:06.360 08:19:19 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 987639 ]] 00:07:06.360 08:19:19 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 987639 00:07:06.360 08:19:19 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 987639 ']' 00:07:06.360 08:19:19 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 987639 00:07:06.360 08:19:19 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 
00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 987639 00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 987639' 00:07:06.619 killing process with pid 987639 00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 987639 00:07:06.619 08:19:19 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 987639 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 987626 ]] 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 987626 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 987626 ']' 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 987626 00:07:06.878 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (987626) - No such process 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 987626 is not found' 00:07:06.878 Process with pid 987626 is not found 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 987639 ]] 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 987639 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 987639 ']' 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 987639 00:07:06.878 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (987639) - No such process 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 987639 is not found' 00:07:06.878 Process with pid 987639 is not found 00:07:06.878 08:19:19 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:06.878 00:07:06.878 real 0m15.388s 00:07:06.878 user 0m25.662s 00:07:06.878 sys 0m6.018s 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:06.878 08:19:19 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:06.878 ************************************ 00:07:06.878 END TEST cpu_locks 00:07:06.878 ************************************ 00:07:06.878 00:07:06.878 real 0m40.295s 00:07:06.878 user 1m14.698s 00:07:06.878 sys 0m10.393s 00:07:06.878 08:19:19 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:06.879 08:19:19 event -- common/autotest_common.sh@10 -- # set +x 00:07:06.879 ************************************ 00:07:06.879 END TEST event 00:07:06.879 ************************************ 00:07:06.879 08:19:19 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:06.879 08:19:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:06.879 08:19:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:06.879 08:19:19 -- common/autotest_common.sh@10 -- # set +x 00:07:06.879 ************************************ 00:07:06.879 START TEST thread 00:07:06.879 ************************************ 00:07:06.879 08:19:19 thread -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:07.138 * Looking for test storage... 00:07:07.138 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1681 -- # lcov --version 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:07.138 08:19:20 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:07.138 08:19:20 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:07.138 08:19:20 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:07.138 08:19:20 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:07.138 08:19:20 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:07.138 08:19:20 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:07.138 08:19:20 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:07.138 08:19:20 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:07.138 08:19:20 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:07.138 08:19:20 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:07.138 08:19:20 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:07.138 08:19:20 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:07.138 08:19:20 thread -- scripts/common.sh@345 -- # : 1 00:07:07.138 08:19:20 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:07.138 08:19:20 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:07.138 08:19:20 thread -- scripts/common.sh@365 -- # decimal 1 00:07:07.138 08:19:20 thread -- scripts/common.sh@353 -- # local d=1 00:07:07.138 08:19:20 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:07.138 08:19:20 thread -- scripts/common.sh@355 -- # echo 1 00:07:07.138 08:19:20 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:07.138 08:19:20 thread -- scripts/common.sh@366 -- # decimal 2 00:07:07.138 08:19:20 thread -- scripts/common.sh@353 -- # local d=2 00:07:07.138 08:19:20 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:07.138 08:19:20 thread -- scripts/common.sh@355 -- # echo 2 00:07:07.138 08:19:20 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:07.138 08:19:20 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:07.138 08:19:20 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:07.138 08:19:20 thread -- scripts/common.sh@368 -- # return 0 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:07.138 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.138 --rc genhtml_branch_coverage=1 00:07:07.138 --rc genhtml_function_coverage=1 00:07:07.138 --rc genhtml_legend=1 00:07:07.138 --rc geninfo_all_blocks=1 00:07:07.138 --rc geninfo_unexecuted_blocks=1 00:07:07.138 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.138 ' 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:07.138 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.138 --rc genhtml_branch_coverage=1 00:07:07.138 --rc genhtml_function_coverage=1 00:07:07.138 --rc genhtml_legend=1 00:07:07.138 --rc geninfo_all_blocks=1 
00:07:07.138 --rc geninfo_unexecuted_blocks=1 00:07:07.138 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.138 ' 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:07.138 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.138 --rc genhtml_branch_coverage=1 00:07:07.138 --rc genhtml_function_coverage=1 00:07:07.138 --rc genhtml_legend=1 00:07:07.138 --rc geninfo_all_blocks=1 00:07:07.138 --rc geninfo_unexecuted_blocks=1 00:07:07.138 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.138 ' 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:07.138 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:07.138 --rc genhtml_branch_coverage=1 00:07:07.138 --rc genhtml_function_coverage=1 00:07:07.138 --rc genhtml_legend=1 00:07:07.138 --rc geninfo_all_blocks=1 00:07:07.138 --rc geninfo_unexecuted_blocks=1 00:07:07.138 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:07.138 ' 00:07:07.138 08:19:20 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:07.138 08:19:20 thread -- common/autotest_common.sh@10 -- # set +x 00:07:07.138 ************************************ 00:07:07.138 START TEST thread_poller_perf 00:07:07.138 ************************************ 00:07:07.138 08:19:20 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:07.138 [2024-11-17 08:19:20.212069] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:07.138 [2024-11-17 08:19:20.212125] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid988274 ] 00:07:07.138 [2024-11-17 08:19:20.271715] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.398 [2024-11-17 08:19:20.310260] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.398 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:08.334 [2024-11-17T07:19:21.473Z] ====================================== 00:07:08.334 [2024-11-17T07:19:21.473Z] busy:2504830966 (cyc) 00:07:08.334 [2024-11-17T07:19:21.473Z] total_run_count: 816000 00:07:08.334 [2024-11-17T07:19:21.473Z] tsc_hz: 2500000000 (cyc) 00:07:08.334 [2024-11-17T07:19:21.473Z] ====================================== 00:07:08.334 [2024-11-17T07:19:21.473Z] poller_cost: 3069 (cyc), 1227 (nsec) 00:07:08.335 00:07:08.335 real 0m1.168s 00:07:08.335 user 0m1.085s 00:07:08.335 sys 0m0.079s 00:07:08.335 08:19:21 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:08.335 08:19:21 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:08.335 ************************************ 00:07:08.335 END TEST thread_poller_perf 00:07:08.335 ************************************ 00:07:08.335 08:19:21 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:08.335 08:19:21 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:08.335 08:19:21 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:08.335 08:19:21 thread -- common/autotest_common.sh@10 -- # set +x 00:07:08.335 ************************************ 00:07:08.335 START TEST thread_poller_perf 00:07:08.335 ************************************ 00:07:08.335 08:19:21 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:08.335 [2024-11-17 08:19:21.467337] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:08.335 [2024-11-17 08:19:21.467435] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid988428 ] 00:07:08.594 [2024-11-17 08:19:21.538321] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.594 [2024-11-17 08:19:21.575812] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.594 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:09.532 [2024-11-17T07:19:22.671Z] ====================================== 00:07:09.532 [2024-11-17T07:19:22.671Z] busy:2501456108 (cyc) 00:07:09.532 [2024-11-17T07:19:22.671Z] total_run_count: 13247000 00:07:09.532 [2024-11-17T07:19:22.671Z] tsc_hz: 2500000000 (cyc) 00:07:09.532 [2024-11-17T07:19:22.671Z] ====================================== 00:07:09.532 [2024-11-17T07:19:22.671Z] poller_cost: 188 (cyc), 75 (nsec) 00:07:09.532 00:07:09.532 real 0m1.183s 00:07:09.532 user 0m1.098s 00:07:09.532 sys 0m0.082s 00:07:09.532 08:19:22 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:09.532 08:19:22 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:09.532 ************************************ 00:07:09.532 END TEST thread_poller_perf 00:07:09.532 ************************************ 00:07:09.792 08:19:22 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:09.792 08:19:22 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:09.792 08:19:22 thread -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:09.792 08:19:22 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:09.792 08:19:22 thread -- common/autotest_common.sh@10 -- # set +x 00:07:09.792 ************************************ 00:07:09.792 START TEST thread_spdk_lock 00:07:09.792 ************************************ 00:07:09.792 08:19:22 thread.thread_spdk_lock -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:09.792 [2024-11-17 08:19:22.727620] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:09.792 [2024-11-17 08:19:22.727708] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid988593 ] 00:07:09.792 [2024-11-17 08:19:22.799681] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:09.792 [2024-11-17 08:19:22.838233] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.792 [2024-11-17 08:19:22.838234] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.362 [2024-11-17 08:19:23.317006] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 967:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:10.362 [2024-11-17 08:19:23.317044] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3080:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:10.362 [2024-11-17 08:19:23.317055] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3035:sspin_stacks_print: *ERROR*: spinlock 0x134f900 00:07:10.362 [2024-11-17 08:19:23.317817] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:10.362 [2024-11-17 08:19:23.317921] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1028:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:10.362 [2024-11-17 
08:19:23.317940] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:10.362 Starting test contend 00:07:10.362 Worker Delay Wait us Hold us Total us 00:07:10.362 0 3 166728 180798 347526 00:07:10.362 1 5 82773 282793 365567 00:07:10.362 PASS test contend 00:07:10.362 Starting test hold_by_poller 00:07:10.362 PASS test hold_by_poller 00:07:10.362 Starting test hold_by_message 00:07:10.362 PASS test hold_by_message 00:07:10.362 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:10.362 100014 assertions passed 00:07:10.362 0 assertions failed 00:07:10.362 00:07:10.362 real 0m0.660s 00:07:10.362 user 0m1.042s 00:07:10.362 sys 0m0.093s 00:07:10.362 08:19:23 thread.thread_spdk_lock -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:10.362 08:19:23 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:10.362 ************************************ 00:07:10.362 END TEST thread_spdk_lock 00:07:10.362 ************************************ 00:07:10.362 00:07:10.362 real 0m3.412s 00:07:10.362 user 0m3.403s 00:07:10.362 sys 0m0.501s 00:07:10.362 08:19:23 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:10.362 08:19:23 thread -- common/autotest_common.sh@10 -- # set +x 00:07:10.362 ************************************ 00:07:10.362 END TEST thread 00:07:10.362 ************************************ 00:07:10.362 08:19:23 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:10.362 08:19:23 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:10.362 08:19:23 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:10.362 08:19:23 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:10.362 08:19:23 -- common/autotest_common.sh@10 -- # set +x 00:07:10.362 ************************************ 00:07:10.362 START TEST app_cmdline 00:07:10.362 ************************************ 00:07:10.362 08:19:23 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:10.622 * Looking for test storage... 
00:07:10.622 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1681 -- # lcov --version 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.622 08:19:23 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:10.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.622 --rc genhtml_branch_coverage=1 00:07:10.622 --rc genhtml_function_coverage=1 00:07:10.622 --rc genhtml_legend=1 00:07:10.622 --rc geninfo_all_blocks=1 00:07:10.622 --rc geninfo_unexecuted_blocks=1 00:07:10.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.622 ' 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:10.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.622 --rc genhtml_branch_coverage=1 00:07:10.622 --rc genhtml_function_coverage=1 00:07:10.622 --rc 
genhtml_legend=1 00:07:10.622 --rc geninfo_all_blocks=1 00:07:10.622 --rc geninfo_unexecuted_blocks=1 00:07:10.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.622 ' 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:10.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.622 --rc genhtml_branch_coverage=1 00:07:10.622 --rc genhtml_function_coverage=1 00:07:10.622 --rc genhtml_legend=1 00:07:10.622 --rc geninfo_all_blocks=1 00:07:10.622 --rc geninfo_unexecuted_blocks=1 00:07:10.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.622 ' 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:10.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.622 --rc genhtml_branch_coverage=1 00:07:10.622 --rc genhtml_function_coverage=1 00:07:10.622 --rc genhtml_legend=1 00:07:10.622 --rc geninfo_all_blocks=1 00:07:10.622 --rc geninfo_unexecuted_blocks=1 00:07:10.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.622 ' 00:07:10.622 08:19:23 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:10.622 08:19:23 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=988917 00:07:10.622 08:19:23 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:10.622 08:19:23 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 988917 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 988917 ']' 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:10.622 08:19:23 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:10.622 [2024-11-17 08:19:23.705552] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:10.622 [2024-11-17 08:19:23.705632] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid988917 ] 00:07:10.882 [2024-11-17 08:19:23.772812] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.882 [2024-11-17 08:19:23.810780] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.882 08:19:23 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:10.882 08:19:23 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:07:10.882 08:19:23 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:11.142 { 00:07:11.142 "version": "SPDK v24.09.1-pre git sha1 b18e1bd62", 00:07:11.142 "fields": { 00:07:11.142 "major": 24, 00:07:11.142 "minor": 9, 00:07:11.142 "patch": 1, 00:07:11.142 "suffix": "-pre", 00:07:11.142 "commit": "b18e1bd62" 00:07:11.142 } 00:07:11.142 } 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:11.142 08:19:24 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:11.142 08:19:24 app_cmdline -- 
common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:11.142 08:19:24 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:11.401 request: 00:07:11.401 { 00:07:11.401 "method": "env_dpdk_get_mem_stats", 00:07:11.401 "req_id": 1 00:07:11.401 } 00:07:11.401 Got JSON-RPC error response 00:07:11.401 response: 00:07:11.401 { 00:07:11.401 "code": -32601, 00:07:11.401 "message": "Method not found" 00:07:11.401 } 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:11.401 08:19:24 app_cmdline -- app/cmdline.sh@1 -- # killprocess 988917 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 988917 ']' 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 988917 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 988917 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 988917' 00:07:11.401 killing process with pid 988917 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@969 -- # kill 988917 00:07:11.401 08:19:24 app_cmdline -- common/autotest_common.sh@974 -- # wait 988917 00:07:11.970 00:07:11.970 real 0m1.315s 00:07:11.970 user 0m1.484s 00:07:11.970 sys 0m0.511s 00:07:11.970 08:19:24 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:11.970 08:19:24 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:11.970 ************************************ 00:07:11.970 END TEST app_cmdline 00:07:11.970 ************************************ 00:07:11.970 08:19:24 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:11.970 08:19:24 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:11.970 08:19:24 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:11.970 08:19:24 -- common/autotest_common.sh@10 -- # set +x 00:07:11.970 ************************************ 00:07:11.970 START TEST version 00:07:11.970 ************************************ 00:07:11.970 08:19:24 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:11.970 * Looking for test storage... 
00:07:11.970 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:11.970 08:19:24 version -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:11.970 08:19:24 version -- common/autotest_common.sh@1681 -- # lcov --version 00:07:11.970 08:19:24 version -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:11.970 08:19:25 version -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:11.970 08:19:25 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:11.970 08:19:25 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:11.970 08:19:25 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:11.970 08:19:25 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:11.970 08:19:25 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:11.970 08:19:25 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:11.970 08:19:25 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:11.970 08:19:25 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:11.970 08:19:25 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:11.970 08:19:25 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:11.970 08:19:25 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:11.970 08:19:25 version -- scripts/common.sh@344 -- # case "$op" in 00:07:11.970 08:19:25 version -- scripts/common.sh@345 -- # : 1 00:07:11.970 08:19:25 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:11.970 08:19:25 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:11.970 08:19:25 version -- scripts/common.sh@365 -- # decimal 1 00:07:11.970 08:19:25 version -- scripts/common.sh@353 -- # local d=1 00:07:11.970 08:19:25 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:11.970 08:19:25 version -- scripts/common.sh@355 -- # echo 1 00:07:11.970 08:19:25 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:11.970 08:19:25 version -- scripts/common.sh@366 -- # decimal 2 00:07:11.970 08:19:25 version -- scripts/common.sh@353 -- # local d=2 00:07:11.970 08:19:25 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:11.970 08:19:25 version -- scripts/common.sh@355 -- # echo 2 00:07:11.970 08:19:25 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:11.970 08:19:25 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:11.970 08:19:25 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:11.970 08:19:25 version -- scripts/common.sh@368 -- # return 0 00:07:11.970 08:19:25 version -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:11.970 08:19:25 version -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:11.970 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.970 --rc genhtml_branch_coverage=1 00:07:11.970 --rc genhtml_function_coverage=1 00:07:11.970 --rc genhtml_legend=1 00:07:11.970 --rc geninfo_all_blocks=1 00:07:11.970 --rc geninfo_unexecuted_blocks=1 00:07:11.970 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.970 ' 00:07:11.970 08:19:25 version -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:11.970 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.970 --rc genhtml_branch_coverage=1 00:07:11.970 --rc genhtml_function_coverage=1 00:07:11.970 --rc genhtml_legend=1 00:07:11.970 --rc geninfo_all_blocks=1 00:07:11.970 --rc geninfo_unexecuted_blocks=1 00:07:11.970 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.970 ' 00:07:11.970 08:19:25 version -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:11.970 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.970 --rc genhtml_branch_coverage=1 00:07:11.970 --rc genhtml_function_coverage=1 00:07:11.970 --rc genhtml_legend=1 00:07:11.970 --rc geninfo_all_blocks=1 00:07:11.970 --rc geninfo_unexecuted_blocks=1 00:07:11.970 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.970 ' 00:07:11.970 08:19:25 version -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:11.970 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.970 --rc genhtml_branch_coverage=1 00:07:11.970 --rc genhtml_function_coverage=1 00:07:11.970 --rc genhtml_legend=1 00:07:11.970 --rc geninfo_all_blocks=1 00:07:11.970 --rc geninfo_unexecuted_blocks=1 00:07:11.970 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.970 ' 00:07:11.970 08:19:25 version -- app/version.sh@17 -- # get_header_version major 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # tr -d '"' 00:07:11.970 08:19:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # cut -f2 00:07:11.970 08:19:25 version -- app/version.sh@17 -- # major=24 00:07:11.970 08:19:25 version -- app/version.sh@18 -- # get_header_version minor 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # cut -f2 00:07:11.970 08:19:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # tr -d '"' 00:07:11.970 08:19:25 version -- app/version.sh@18 -- # minor=9 00:07:11.970 08:19:25 version -- app/version.sh@19 -- # get_header_version patch 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # cut -f2 00:07:11.970 08:19:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # tr -d '"' 00:07:11.970 08:19:25 version -- app/version.sh@19 -- # patch=1 00:07:11.970 08:19:25 version -- app/version.sh@20 -- # get_header_version suffix 00:07:11.970 08:19:25 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # cut -f2 00:07:11.970 08:19:25 version -- app/version.sh@14 -- # tr -d '"' 00:07:11.970 08:19:25 version -- app/version.sh@20 -- # suffix=-pre 00:07:11.970 08:19:25 version -- app/version.sh@22 -- # version=24.9 00:07:11.970 08:19:25 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:11.970 08:19:25 version -- app/version.sh@25 -- # version=24.9.1 00:07:11.970 08:19:25 version -- app/version.sh@28 -- # version=24.9.1rc0 00:07:11.970 08:19:25 version -- app/version.sh@30 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:11.970 08:19:25 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:12.231 08:19:25 version -- app/version.sh@30 -- # py_version=24.9.1rc0 00:07:12.231 08:19:25 version -- app/version.sh@31 -- # [[ 24.9.1rc0 == \2\4\.\9\.\1\r\c\0 ]] 00:07:12.231 00:07:12.231 real 0m0.262s 00:07:12.231 user 0m0.152s 00:07:12.231 sys 0m0.155s 00:07:12.231 08:19:25 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:12.231 08:19:25 version -- common/autotest_common.sh@10 -- # set +x 00:07:12.231 ************************************ 00:07:12.231 END TEST version 00:07:12.231 ************************************ 00:07:12.231 08:19:25 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@194 -- # uname -s 00:07:12.231 08:19:25 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@256 -- # timing_exit lib 00:07:12.231 08:19:25 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:12.231 08:19:25 -- common/autotest_common.sh@10 -- # set +x 00:07:12.231 08:19:25 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@263 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@272 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@307 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@351 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:12.231 08:19:25 -- spdk/autotest.sh@362 -- # [[ 0 -eq 1 ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@370 -- # [[ 1 -eq 1 ]] 00:07:12.231 08:19:25 -- spdk/autotest.sh@371 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:12.231 08:19:25 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:12.231 08:19:25 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:12.231 08:19:25 -- common/autotest_common.sh@10 -- # set +x 00:07:12.231 ************************************ 00:07:12.231 START TEST llvm_fuzz 00:07:12.231 ************************************ 00:07:12.231 08:19:25 llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:12.231 * Looking for test storage... 
00:07:12.231 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:12.231 08:19:25 llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:12.231 08:19:25 llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:12.231 08:19:25 llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:12.491 08:19:25 llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:12.491 08:19:25 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:12.491 08:19:25 llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:12.491 08:19:25 llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:12.491 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.491 --rc genhtml_branch_coverage=1 00:07:12.491 --rc genhtml_function_coverage=1 00:07:12.491 --rc genhtml_legend=1 00:07:12.491 --rc geninfo_all_blocks=1 00:07:12.491 --rc geninfo_unexecuted_blocks=1 00:07:12.491 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.491 ' 00:07:12.491 08:19:25 llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:12.491 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.491 --rc genhtml_branch_coverage=1 00:07:12.491 --rc genhtml_function_coverage=1 00:07:12.491 --rc genhtml_legend=1 00:07:12.491 --rc geninfo_all_blocks=1 00:07:12.491 --rc 
geninfo_unexecuted_blocks=1 00:07:12.491 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.491 ' 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:12.492 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.492 --rc genhtml_branch_coverage=1 00:07:12.492 --rc genhtml_function_coverage=1 00:07:12.492 --rc genhtml_legend=1 00:07:12.492 --rc geninfo_all_blocks=1 00:07:12.492 --rc geninfo_unexecuted_blocks=1 00:07:12.492 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.492 ' 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:12.492 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.492 --rc genhtml_branch_coverage=1 00:07:12.492 --rc genhtml_function_coverage=1 00:07:12.492 --rc genhtml_legend=1 00:07:12.492 --rc geninfo_all_blocks=1 00:07:12.492 --rc geninfo_unexecuted_blocks=1 00:07:12.492 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.492 ' 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@548 -- # local fuzzers 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:12.492 08:19:25 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:12.492 08:19:25 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:12.492 ************************************ 00:07:12.492 START TEST nvmf_llvm_fuzz 00:07:12.492 ************************************ 00:07:12.492 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:12.492 * Looking for test storage... 
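Before llvm.sh (and, below, nvmf/run.sh) settles on its LCOV flags, scripts/common.sh checks that the installed lcov is older than 2.0: the "lt 1.15 2" call above splits each version on ".", "-" and ":" and compares field by field. A sketch of that comparator, reconstructed from the trace rather than copied from scripts/common.sh (the pattern match on the operator is a simplification):

    lt() { cmp_versions "$1" '<' "$2"; }              # "is $1 older than $2"

    cmp_versions() {
        local -a ver1 ver2
        local op=$2 v a b
        IFS=.-: read -ra ver1 <<< "$1"                # 1.15 -> (1 15)
        IFS=.-: read -ra ver2 <<< "$3"                # 2    -> (2)

        for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
            a=${ver1[v]:-0} b=${ver2[v]:-0}           # missing fields compare as 0
            (( a > b )) && { [[ $op == '>'* ]]; return; }
            (( a < b )) && { [[ $op == '<'* ]]; return; }
        done
        [[ $op == *'='* ]]                            # every field equal
    }

    # lcov 1.15 is older than 2, so the --rc spellings of the options apply:
    lt 1.15 2 && lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'

That result feeds the LCOV_OPTS/LCOV exports repeated throughout this log, each of which also points --gcov-tool at test/fuzz/llvm/llvm-gcov.sh so coverage collection works under clang.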
00:07:12.492 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:12.492 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:12.492 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:12.492 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:12.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.755 --rc genhtml_branch_coverage=1 00:07:12.755 --rc genhtml_function_coverage=1 00:07:12.755 --rc genhtml_legend=1 00:07:12.755 --rc geninfo_all_blocks=1 00:07:12.755 --rc geninfo_unexecuted_blocks=1 00:07:12.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.755 ' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:12.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.755 --rc genhtml_branch_coverage=1 00:07:12.755 --rc genhtml_function_coverage=1 00:07:12.755 --rc genhtml_legend=1 00:07:12.755 --rc geninfo_all_blocks=1 00:07:12.755 --rc geninfo_unexecuted_blocks=1 00:07:12.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.755 ' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:12.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.755 --rc genhtml_branch_coverage=1 00:07:12.755 --rc genhtml_function_coverage=1 00:07:12.755 --rc genhtml_legend=1 00:07:12.755 --rc geninfo_all_blocks=1 00:07:12.755 --rc geninfo_unexecuted_blocks=1 00:07:12.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.755 ' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:12.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.755 --rc genhtml_branch_coverage=1 00:07:12.755 --rc genhtml_function_coverage=1 00:07:12.755 --rc genhtml_legend=1 00:07:12.755 --rc geninfo_all_blocks=1 00:07:12.755 --rc geninfo_unexecuted_blocks=1 00:07:12.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.755 ' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:07:12.755 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:07:12.756 08:19:25 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:12.756 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:12.756 #define SPDK_CONFIG_H 00:07:12.756 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:12.756 #define SPDK_CONFIG_APPS 1 00:07:12.756 #define SPDK_CONFIG_ARCH native 00:07:12.756 #undef SPDK_CONFIG_ASAN 00:07:12.756 #undef SPDK_CONFIG_AVAHI 00:07:12.756 #undef SPDK_CONFIG_CET 00:07:12.756 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:12.756 #define SPDK_CONFIG_COVERAGE 1 00:07:12.756 #define SPDK_CONFIG_CROSS_PREFIX 00:07:12.756 #undef SPDK_CONFIG_CRYPTO 00:07:12.756 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:12.756 #undef SPDK_CONFIG_CUSTOMOCF 00:07:12.756 #undef SPDK_CONFIG_DAOS 00:07:12.756 #define SPDK_CONFIG_DAOS_DIR 00:07:12.756 #define SPDK_CONFIG_DEBUG 1 00:07:12.756 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:12.756 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:12.756 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:12.756 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:12.756 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:12.756 #undef SPDK_CONFIG_DPDK_UADK 00:07:12.756 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:12.756 #define SPDK_CONFIG_EXAMPLES 1 00:07:12.756 #undef SPDK_CONFIG_FC 00:07:12.756 #define SPDK_CONFIG_FC_PATH 00:07:12.756 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:12.756 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:12.756 #define SPDK_CONFIG_FSDEV 1 00:07:12.756 #undef SPDK_CONFIG_FUSE 00:07:12.756 #define SPDK_CONFIG_FUZZER 1 00:07:12.756 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:12.756 #undef SPDK_CONFIG_GOLANG 00:07:12.756 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:12.756 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:12.756 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:12.756 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:12.756 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:12.756 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:12.756 #undef SPDK_CONFIG_HAVE_LZ4 00:07:12.756 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:12.756 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:12.756 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:12.756 #define SPDK_CONFIG_IDXD 1 00:07:12.756 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:12.756 #undef SPDK_CONFIG_IPSEC_MB 00:07:12.756 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:12.756 #define SPDK_CONFIG_ISAL 1 00:07:12.756 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:12.756 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:12.756 #define SPDK_CONFIG_LIBDIR 00:07:12.756 #undef SPDK_CONFIG_LTO 00:07:12.756 #define SPDK_CONFIG_MAX_LCORES 128 00:07:12.756 #define SPDK_CONFIG_NVME_CUSE 1 00:07:12.756 #undef SPDK_CONFIG_OCF 00:07:12.756 #define SPDK_CONFIG_OCF_PATH 00:07:12.756 #define SPDK_CONFIG_OPENSSL_PATH 00:07:12.756 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:12.756 #define SPDK_CONFIG_PGO_DIR 00:07:12.756 #undef SPDK_CONFIG_PGO_USE 00:07:12.756 #define SPDK_CONFIG_PREFIX /usr/local 00:07:12.756 #undef SPDK_CONFIG_RAID5F 00:07:12.756 #undef SPDK_CONFIG_RBD 00:07:12.756 #define SPDK_CONFIG_RDMA 1 00:07:12.756 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:12.756 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:12.756 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:12.756 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:12.756 #undef SPDK_CONFIG_SHARED 00:07:12.756 #undef SPDK_CONFIG_SMA 00:07:12.756 #define SPDK_CONFIG_TESTS 1 00:07:12.756 #undef SPDK_CONFIG_TSAN 00:07:12.756 #define SPDK_CONFIG_UBLK 1 00:07:12.756 #define SPDK_CONFIG_UBSAN 1 00:07:12.756 #undef SPDK_CONFIG_UNIT_TESTS 00:07:12.756 #undef SPDK_CONFIG_URING 00:07:12.756 #define SPDK_CONFIG_URING_PATH 00:07:12.756 #undef SPDK_CONFIG_URING_ZNS 00:07:12.757 #undef SPDK_CONFIG_USDT 00:07:12.757 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:12.757 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:12.757 #define SPDK_CONFIG_VFIO_USER 1 00:07:12.757 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:12.757 #define SPDK_CONFIG_VHOST 1 00:07:12.757 #define SPDK_CONFIG_VIRTIO 1 00:07:12.757 #undef SPDK_CONFIG_VTUNE 00:07:12.757 #define SPDK_CONFIG_VTUNE_DIR 00:07:12.757 #define SPDK_CONFIG_WERROR 1 00:07:12.757 #define SPDK_CONFIG_WPDK_DIR 00:07:12.757 #undef SPDK_CONFIG_XNVME 00:07:12.757 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:12.757 08:19:25 
llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:12.757 08:19:25 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:12.757 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
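The long run of ": 0" / ": 1" entries followed by "export SPDK_TEST_..." above and below is autotest_common.sh giving every test knob a value and exporting it. Since xtrace prints the already-expanded value, these pairs are consistent with a default-then-export idiom like the sketch below; the defaults shown are an inference from the trace, not a copy of autotest_common.sh:

    # Give each knob a value only if the job has not already set one, then
    # export it so run_test() children and the fuzz harnesses inherit it.
    : "${SPDK_TEST_FUZZER:=0}";            export SPDK_TEST_FUZZER
    : "${SPDK_TEST_FUZZER_SHORT:=0}";      export SPDK_TEST_FUZZER_SHORT
    : "${SPDK_TEST_NVMF:=0}";              export SPDK_TEST_NVMF
    : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"; export SPDK_TEST_NVMF_TRANSPORT
    : "${SPDK_RUN_UBSAN:=0}";              export SPDK_RUN_UBSAN

In this short-fuzz run the fuzzer, UBSAN, nightly and functional-test knobs trace as 1 and the NVMf transport as rdma; most other knobs stay 0, and those same flags are what the autotest.sh "'[' 0 -eq 1 ']'" branches earlier in the log were testing.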
00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:12.758 08:19:25 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:12.758 08:19:25 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:12.758 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 989364 ]] 00:07:12.759 
08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 989364 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.6fOJlm 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.6fOJlm/tests/nvmf /tmp/spdk.6fOJlm 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=52391424000 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730594816 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=9339170816 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30861869056 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865297408 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=3428352 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340121600 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=6000640 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:12.759 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30865092608 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865297408 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=204800 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:07:12.760 * Looking for test storage... 
00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=52391424000 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=11553763328 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:12.760 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:12.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.760 --rc genhtml_branch_coverage=1 00:07:12.760 --rc genhtml_function_coverage=1 00:07:12.760 --rc genhtml_legend=1 00:07:12.760 --rc geninfo_all_blocks=1 00:07:12.760 --rc geninfo_unexecuted_blocks=1 00:07:12.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.760 ' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:12.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.760 --rc genhtml_branch_coverage=1 00:07:12.760 --rc genhtml_function_coverage=1 00:07:12.760 --rc genhtml_legend=1 00:07:12.760 --rc geninfo_all_blocks=1 00:07:12.760 --rc geninfo_unexecuted_blocks=1 00:07:12.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.760 ' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:12.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.760 --rc genhtml_branch_coverage=1 00:07:12.760 --rc genhtml_function_coverage=1 00:07:12.760 --rc genhtml_legend=1 00:07:12.760 --rc geninfo_all_blocks=1 00:07:12.760 --rc geninfo_unexecuted_blocks=1 00:07:12.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.760 ' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:12.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.760 --rc genhtml_branch_coverage=1 00:07:12.760 --rc genhtml_function_coverage=1 00:07:12.760 --rc genhtml_legend=1 00:07:12.760 --rc geninfo_all_blocks=1 00:07:12.760 --rc geninfo_unexecuted_blocks=1 00:07:12.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.760 ' 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:12.760 08:19:25 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:12.760 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:12.761 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:13.020 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:13.020 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:13.020 08:19:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:13.020 [2024-11-17 08:19:25.920943] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:13.020 [2024-11-17 08:19:25.921011] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid989438 ] 00:07:13.020 [2024-11-17 08:19:26.094999] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.020 [2024-11-17 08:19:26.117268] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.280 [2024-11-17 08:19:26.170150] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:13.280 [2024-11-17 08:19:26.186485] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:13.280 INFO: Running with entropic power schedule (0xFF, 100). 00:07:13.280 INFO: Seed: 294482166 00:07:13.280 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:13.280 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:13.280 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:13.280 INFO: A corpus is not provided, starting from an empty corpus 00:07:13.280 #2 INITED exec/s: 0 rss: 65Mb 00:07:13.280 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:13.280 This may also happen if the target rejected all inputs we tried so far 00:07:13.280 [2024-11-17 08:19:26.252585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.280 [2024-11-17 08:19:26.252621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.539 NEW_FUNC[1/714]: 0x459648 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:13.539 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:13.539 #3 NEW cov: 12149 ft: 12153 corp: 2/98b lim: 320 exec/s: 0 rss: 71Mb L: 97/97 MS: 1 InsertRepeatedBytes- 00:07:13.539 [2024-11-17 08:19:26.583363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.539 [2024-11-17 08:19:26.583406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.539 #4 NEW cov: 12265 ft: 12688 corp: 3/195b lim: 320 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 CopyPart- 00:07:13.539 [2024-11-17 08:19:26.643436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.539 [2024-11-17 08:19:26.643464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.539 #5 NEW cov: 12271 ft: 13046 corp: 4/292b lim: 320 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 ChangeBinInt- 00:07:13.799 [2024-11-17 08:19:26.683570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.799 [2024-11-17 08:19:26.683599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.799 #6 NEW cov: 12356 ft: 13272 corp: 5/389b lim: 320 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 ChangeBit- 00:07:13.799 [2024-11-17 08:19:26.743700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe3f1f1f1f1f1f1f1 00:07:13.799 [2024-11-17 08:19:26.743728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.799 #12 NEW cov: 12356 ft: 13360 corp: 6/486b lim: 320 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 ChangeByte- 00:07:13.799 [2024-11-17 08:19:26.813978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.799 [2024-11-17 08:19:26.814006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.799 #18 NEW cov: 12356 ft: 13497 corp: 7/583b lim: 320 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 ChangeBit- 00:07:13.799 [2024-11-17 08:19:26.854038] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.799 [2024-11-17 08:19:26.854066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.799 #19 NEW cov: 12356 ft: 13565 corp: 8/681b lim: 320 exec/s: 0 rss: 72Mb L: 98/98 MS: 1 InsertByte- 00:07:13.799 [2024-11-17 08:19:26.914243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:13.799 [2024-11-17 08:19:26.914272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.799 #20 NEW cov: 12356 ft: 13594 corp: 9/779b lim: 320 exec/s: 0 rss: 72Mb L: 98/98 MS: 1 InsertByte- 00:07:14.058 [2024-11-17 08:19:26.954335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.058 [2024-11-17 08:19:26.954362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.058 #26 NEW cov: 12356 ft: 13694 corp: 10/876b lim: 320 exec/s: 0 rss: 72Mb L: 97/98 MS: 1 ShuffleBytes- 00:07:14.058 [2024-11-17 08:19:26.994702] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.058 [2024-11-17 08:19:26.994730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.058 [2024-11-17 08:19:26.994851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.058 [2024-11-17 08:19:26.994872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.058 NEW_FUNC[1/1]: 0x194dc38 in nvme_get_sgl_unkeyed 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:14.058 #27 NEW cov: 12391 ft: 14323 corp: 11/1048b lim: 320 exec/s: 0 rss: 72Mb L: 172/172 MS: 1 CrossOver- 00:07:14.058 [2024-11-17 08:19:27.054658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.058 [2024-11-17 08:19:27.054683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.058 #28 NEW cov: 12391 ft: 14348 corp: 12/1145b lim: 320 exec/s: 0 rss: 72Mb L: 97/172 MS: 1 ShuffleBytes- 00:07:14.058 [2024-11-17 08:19:27.094680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.058 [2024-11-17 08:19:27.094714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.058 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:14.058 #29 NEW cov: 12414 ft: 14463 corp: 13/1243b lim: 320 exec/s: 0 rss: 72Mb L: 98/172 MS: 1 CopyPart- 00:07:14.058 [2024-11-17 08:19:27.155181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.058 [2024-11-17 08:19:27.155211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.058 [2024-11-17 08:19:27.155326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.058 [2024-11-17 08:19:27.155343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.058 #30 NEW cov: 12414 ft: 14529 corp: 14/1420b lim: 320 exec/s: 0 rss: 72Mb L: 177/177 MS: 1 CopyPart- 00:07:14.058 [2024-11-17 08:19:27.195110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.058 [2024-11-17 08:19:27.195136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.318 #31 NEW cov: 12414 ft: 14602 corp: 15/1517b lim: 320 exec/s: 0 rss: 72Mb L: 97/177 MS: 1 CrossOver- 00:07:14.318 [2024-11-17 08:19:27.235178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.318 [2024-11-17 08:19:27.235205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.318 #32 NEW cov: 12414 ft: 14613 corp: 16/1614b lim: 320 exec/s: 32 rss: 72Mb L: 97/177 MS: 1 ChangeByte- 00:07:14.318 [2024-11-17 08:19:27.275313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.318 [2024-11-17 08:19:27.275339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.318 #33 NEW cov: 12414 ft: 14618 corp: 17/1711b lim: 320 exec/s: 33 rss: 72Mb L: 97/177 MS: 1 
ChangeBinInt- 00:07:14.318 [2024-11-17 08:19:27.315640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.318 [2024-11-17 08:19:27.315666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.318 [2024-11-17 08:19:27.315783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.318 [2024-11-17 08:19:27.315799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.318 #34 NEW cov: 12414 ft: 14686 corp: 18/1883b lim: 320 exec/s: 34 rss: 73Mb L: 172/177 MS: 1 ChangeByte- 00:07:14.318 [2024-11-17 08:19:27.375596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.318 [2024-11-17 08:19:27.375622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.318 #35 NEW cov: 12414 ft: 14711 corp: 19/1980b lim: 320 exec/s: 35 rss: 73Mb L: 97/177 MS: 1 ChangeBit- 00:07:14.318 [2024-11-17 08:19:27.435743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1e1f1f1f1f1f1 00:07:14.318 [2024-11-17 08:19:27.435774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.318 #36 NEW cov: 12414 ft: 14738 corp: 20/2077b lim: 320 exec/s: 36 rss: 73Mb L: 97/177 MS: 1 ChangeBit- 00:07:14.577 [2024-11-17 08:19:27.475865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:86868686 SGL TRANSPORT DATA BLOCK TRANSPORT 0x8686868686868686 00:07:14.577 [2024-11-17 08:19:27.475893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.577 #39 NEW cov: 12414 ft: 14768 corp: 21/2202b lim: 320 exec/s: 39 rss: 73Mb L: 125/177 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:14.577 [2024-11-17 08:19:27.515938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.578 [2024-11-17 08:19:27.515965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.578 #40 NEW cov: 12414 ft: 14772 corp: 22/2299b lim: 320 exec/s: 40 rss: 73Mb L: 97/177 MS: 1 CopyPart- 00:07:14.578 [2024-11-17 08:19:27.556372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.578 [2024-11-17 08:19:27.556400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.578 [2024-11-17 08:19:27.556524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.578 [2024-11-17 08:19:27.556540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:14.578 #41 NEW cov: 12414 ft: 14783 corp: 23/2456b lim: 320 exec/s: 41 rss: 73Mb L: 157/177 MS: 1 CrossOver- 00:07:14.578 [2024-11-17 08:19:27.596189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.578 [2024-11-17 08:19:27.596217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.578 #42 NEW cov: 12414 ft: 14804 corp: 24/2553b lim: 320 exec/s: 42 rss: 73Mb L: 97/177 MS: 1 ChangeByte- 00:07:14.578 [2024-11-17 08:19:27.646614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f100 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.578 [2024-11-17 08:19:27.646642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.578 [2024-11-17 08:19:27.646780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.578 [2024-11-17 08:19:27.646798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.578 #43 NEW cov: 12414 ft: 14854 corp: 25/2725b lim: 320 exec/s: 43 rss: 73Mb L: 172/177 MS: 1 CMP- DE: "\011\000\000\000"- 00:07:14.578 [2024-11-17 08:19:27.706481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.578 [2024-11-17 08:19:27.706514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.837 #44 NEW cov: 12414 ft: 14871 corp: 26/2822b lim: 320 exec/s: 44 rss: 73Mb L: 97/177 MS: 1 ChangeBit- 00:07:14.837 [2024-11-17 08:19:27.756638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f10af1 00:07:14.837 [2024-11-17 08:19:27.756667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.837 #45 NEW cov: 12414 ft: 14892 corp: 27/2949b lim: 320 exec/s: 45 rss: 73Mb L: 127/177 MS: 1 CrossOver- 00:07:14.837 [2024-11-17 08:19:27.796775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.837 [2024-11-17 08:19:27.796804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.837 #46 NEW cov: 12414 ft: 14905 corp: 28/3046b lim: 320 exec/s: 46 rss: 73Mb L: 97/177 MS: 1 ChangeBit- 00:07:14.837 [2024-11-17 08:19:27.866973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:09f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.837 [2024-11-17 08:19:27.867002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.837 #47 NEW cov: 12414 ft: 14942 corp: 29/3144b lim: 320 exec/s: 47 rss: 73Mb L: 98/177 MS: 1 PersAutoDict- DE: "\011\000\000\000"- 00:07:14.837 [2024-11-17 08:19:27.917068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:14.837 [2024-11-17 08:19:27.917097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.837 #48 NEW cov: 12414 ft: 15017 corp: 30/3241b lim: 320 exec/s: 48 rss: 73Mb L: 97/177 MS: 1 ChangeBit- 00:07:15.097 [2024-11-17 08:19:27.987303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:15.097 [2024-11-17 08:19:27.987331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.097 #49 NEW cov: 12414 ft: 15025 corp: 31/3308b lim: 320 exec/s: 49 rss: 73Mb L: 67/177 MS: 1 EraseBytes- 00:07:15.097 [2024-11-17 08:19:28.047740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f100 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:15.097 [2024-11-17 08:19:28.047769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.097 [2024-11-17 08:19:28.047917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:f1f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.097 [2024-11-17 08:19:28.047933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.097 #50 NEW cov: 12414 ft: 15049 corp: 32/3480b lim: 320 exec/s: 50 rss: 73Mb L: 172/177 MS: 1 CopyPart- 00:07:15.097 [2024-11-17 08:19:28.107581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f1f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:15.097 [2024-11-17 08:19:28.107621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.097 #51 NEW cov: 12414 ft: 15050 corp: 33/3577b lim: 320 exec/s: 51 rss: 73Mb L: 97/177 MS: 1 ChangeByte- 00:07:15.097 [2024-11-17 08:19:28.147825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:15.097 [2024-11-17 08:19:28.147853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.097 #52 NEW cov: 12414 ft: 15088 corp: 34/3694b lim: 320 exec/s: 52 rss: 73Mb L: 117/177 MS: 1 InsertRepeatedBytes- 00:07:15.097 [2024-11-17 08:19:28.187859] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffff1f1f1f1f1 00:07:15.097 [2024-11-17 08:19:28.187887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.097 #53 NEW cov: 12414 ft: 15157 corp: 35/3821b lim: 320 exec/s: 53 rss: 73Mb L: 127/177 MS: 1 InsertRepeatedBytes- 00:07:15.357 [2024-11-17 08:19:28.248321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f1f1f100 SGL TRANSPORT DATA BLOCK TRANSPORT 0xf1f1f1f1f1f1f1f1 00:07:15.357 [2024-11-17 08:19:28.248348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.357 [2024-11-17 08:19:28.248466] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f1) qid:0 cid:5 nsid:9f1f1f1 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.357 [2024-11-17 08:19:28.248499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.357 #54 NEW cov: 12414 ft: 15187 corp: 36/3997b lim: 320 exec/s: 27 rss: 73Mb L: 176/177 MS: 1 PersAutoDict- DE: "\011\000\000\000"- 00:07:15.357 #54 DONE cov: 12414 ft: 15187 corp: 36/3997b lim: 320 exec/s: 27 rss: 73Mb 00:07:15.357 ###### Recommended dictionary. ###### 00:07:15.357 "\011\000\000\000" # Uses: 2 00:07:15.357 ###### End of recommended dictionary. ###### 00:07:15.357 Done 54 runs in 2 second(s) 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:15.357 08:19:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:15.357 [2024-11-17 08:19:28.438028] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
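The per-instance setup traced above shows how nvmf/run.sh picks a unique NVMe/TCP port for each fuzzer: the fuzzer index is zero-padded with printf %02d and appended to 44 (so instance 0 listened on 4400 and this instance listens on 4401), and sed rewrites the shared fuzz_json.conf so the target listens on that port before llvm_nvme_fuzz is launched with the matching -F trid and -c config. A minimal standalone sketch of that derivation, assuming the same 44XX convention; the xtrace does not show where the sed output is written, so the redirect target below is illustrative only:

#!/usr/bin/env bash
# Sketch of the per-fuzzer port selection seen in this log. Only the 44XX prefix,
# the printf %02d padding, and the trsvcid substitution mirror nvmf/run.sh;
# the input path and the redirect are placeholders.
fuzzer_type=1                                   # index of this fuzzer instance (0..24 in this run)
port="44$(printf %02d "$fuzzer_type")"          # 4400 for instance 0, 4401 for instance 1, ...
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    fuzz_json.conf > "/tmp/fuzz_json_${fuzzer_type}.conf"   # hypothetical redirect target
echo "instance $fuzzer_type will listen on 127.0.0.1:$port"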
00:07:15.357 [2024-11-17 08:19:28.438101] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid989960 ] 00:07:15.617 [2024-11-17 08:19:28.614251] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.617 [2024-11-17 08:19:28.635795] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.617 [2024-11-17 08:19:28.688023] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:15.617 [2024-11-17 08:19:28.704291] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:15.617 INFO: Running with entropic power schedule (0xFF, 100). 00:07:15.617 INFO: Seed: 2813478805 00:07:15.617 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:15.617 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:15.617 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:15.617 INFO: A corpus is not provided, starting from an empty corpus 00:07:15.617 #2 INITED exec/s: 0 rss: 65Mb 00:07:15.617 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:15.617 This may also happen if the target rejected all inputs we tried so far 00:07:15.617 [2024-11-17 08:19:28.749024] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:15.617 [2024-11-17 08:19:28.749101] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:15.617 [2024-11-17 08:19:28.749160] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:15.617 [2024-11-17 08:19:28.749274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.617 [2024-11-17 08:19:28.749296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.617 [2024-11-17 08:19:28.749327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.617 [2024-11-17 08:19:28.749342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.617 [2024-11-17 08:19:28.749370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.617 [2024-11-17 08:19:28.749384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.136 NEW_FUNC[1/715]: 0x459f48 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:16.136 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:16.136 #11 NEW cov: 12235 ft: 12231 corp: 2/22b lim: 30 exec/s: 0 rss: 72Mb L: 21/21 MS: 4 CrossOver-CopyPart-InsertByte-InsertRepeatedBytes- 00:07:16.136 [2024-11-17 08:19:29.099935] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 
00:07:16.136 [2024-11-17 08:19:29.100019] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.136 [2024-11-17 08:19:29.100133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.100160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.136 [2024-11-17 08:19:29.100191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.100206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.136 #17 NEW cov: 12348 ft: 13067 corp: 3/35b lim: 30 exec/s: 0 rss: 72Mb L: 13/21 MS: 1 EraseBytes- 00:07:16.136 [2024-11-17 08:19:29.190062] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004056 00:07:16.136 [2024-11-17 08:19:29.190139] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.136 [2024-11-17 08:19:29.190199] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.136 [2024-11-17 08:19:29.190306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.190326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.136 [2024-11-17 08:19:29.190356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.190372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.136 [2024-11-17 08:19:29.190399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.190414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.136 #23 NEW cov: 12354 ft: 13282 corp: 4/57b lim: 30 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertByte- 00:07:16.136 [2024-11-17 08:19:29.250164] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004056 00:07:16.136 [2024-11-17 08:19:29.250239] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.136 [2024-11-17 08:19:29.250297] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.136 [2024-11-17 08:19:29.250407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.250427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.136 [2024-11-17 08:19:29.250458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 
08:19:29.250473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.136 [2024-11-17 08:19:29.250500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.136 [2024-11-17 08:19:29.250515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.395 #29 NEW cov: 12439 ft: 13609 corp: 5/79b lim: 30 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 CrossOver- 00:07:16.395 [2024-11-17 08:19:29.340403] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.395 [2024-11-17 08:19:29.340475] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.395 [2024-11-17 08:19:29.340532] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.395 [2024-11-17 08:19:29.340641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.395 [2024-11-17 08:19:29.340661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.395 [2024-11-17 08:19:29.340690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.395 [2024-11-17 08:19:29.340713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.396 [2024-11-17 08:19:29.340741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3b560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.396 [2024-11-17 08:19:29.340760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.396 #30 NEW cov: 12439 ft: 13721 corp: 6/100b lim: 30 exec/s: 0 rss: 72Mb L: 21/22 MS: 1 ChangeByte- 00:07:16.396 [2024-11-17 08:19:29.390517] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.396 [2024-11-17 08:19:29.390590] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.396 [2024-11-17 08:19:29.390708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.396 [2024-11-17 08:19:29.390730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.396 [2024-11-17 08:19:29.390761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.396 [2024-11-17 08:19:29.390775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.396 #31 NEW cov: 12439 ft: 13829 corp: 7/113b lim: 30 exec/s: 0 rss: 72Mb L: 13/22 MS: 1 ChangeBit- 00:07:16.396 [2024-11-17 08:19:29.480762] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.396 [2024-11-17 08:19:29.480835] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x200005656 00:07:16.396 [2024-11-17 08:19:29.480893] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.396 [2024-11-17 08:19:29.481004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.396 [2024-11-17 08:19:29.481024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.396 [2024-11-17 08:19:29.481054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.396 [2024-11-17 08:19:29.481068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.396 [2024-11-17 08:19:29.481095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560276 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.396 [2024-11-17 08:19:29.481110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.655 #32 NEW cov: 12439 ft: 13987 corp: 8/131b lim: 30 exec/s: 0 rss: 72Mb L: 18/22 MS: 1 CrossOver- 00:07:16.655 [2024-11-17 08:19:29.570996] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c5c5 00:07:16.655 [2024-11-17 08:19:29.571069] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a0a 00:07:16.655 [2024-11-17 08:19:29.571180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c5c581c5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.655 [2024-11-17 08:19:29.571200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.655 [2024-11-17 08:19:29.571229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c5c581c5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.655 [2024-11-17 08:19:29.571244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.655 #34 NEW cov: 12439 ft: 14011 corp: 9/143b lim: 30 exec/s: 0 rss: 72Mb L: 12/22 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:16.655 [2024-11-17 08:19:29.631145] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (202520) > buf size (4096) 00:07:16.655 [2024-11-17 08:19:29.631217] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc0a 00:07:16.655 [2024-11-17 08:19:29.631331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c5c500c5 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.655 [2024-11-17 08:19:29.631352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.655 [2024-11-17 08:19:29.631381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.655 [2024-11-17 08:19:29.631396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.655 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:16.655 #35 NEW cov: 12485 ft: 14105 corp: 10/155b lim: 30 exec/s: 0 rss: 72Mb L: 12/22 MS: 1 ChangeBinInt- 00:07:16.655 [2024-11-17 08:19:29.721384] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (202520) > buf size (4096) 00:07:16.655 [2024-11-17 08:19:29.721456] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xc0a 00:07:16.655 [2024-11-17 08:19:29.721562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c5c500c5 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.655 [2024-11-17 08:19:29.721581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.655 [2024-11-17 08:19:29.721612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.655 [2024-11-17 08:19:29.721626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.915 #36 NEW cov: 12485 ft: 14214 corp: 11/167b lim: 30 exec/s: 36 rss: 73Mb L: 12/22 MS: 1 ChangeBit- 00:07:16.915 [2024-11-17 08:19:29.811635] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.915 [2024-11-17 08:19:29.811715] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (612700) > buf size (4096) 00:07:16.915 [2024-11-17 08:19:29.811774] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.915 [2024-11-17 08:19:29.811886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.915 [2024-11-17 08:19:29.811906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.915 [2024-11-17 08:19:29.811936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.915 [2024-11-17 08:19:29.811951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.915 [2024-11-17 08:19:29.811978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560276 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.915 [2024-11-17 08:19:29.811993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.915 #37 NEW cov: 12485 ft: 14233 corp: 12/185b lim: 30 exec/s: 37 rss: 73Mb L: 18/22 MS: 1 ChangeByte- 00:07:16.915 [2024-11-17 08:19:29.901863] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.915 [2024-11-17 08:19:29.901936] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (612700) > buf size (4096) 00:07:16.915 [2024-11-17 08:19:29.901993] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.915 [2024-11-17 08:19:29.902098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.915 [2024-11-17 08:19:29.902118] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.915 [2024-11-17 08:19:29.902153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.915 [2024-11-17 08:19:29.902169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.915 [2024-11-17 08:19:29.902196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560276 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.915 [2024-11-17 08:19:29.902211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.915 #38 NEW cov: 12485 ft: 14321 corp: 13/203b lim: 30 exec/s: 38 rss: 73Mb L: 18/22 MS: 1 ShuffleBytes- 00:07:16.915 [2024-11-17 08:19:29.992881] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.915 [2024-11-17 08:19:29.992998] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.915 [2024-11-17 08:19:29.993110] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (612700) > buf size (4096) 00:07:16.915 [2024-11-17 08:19:29.993216] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.916 [2024-11-17 08:19:29.993448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:29.993503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.916 [2024-11-17 08:19:29.993588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:29.993617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.916 [2024-11-17 08:19:29.993709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:29.993738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.916 [2024-11-17 08:19:29.993817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:29.993845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.916 #39 NEW cov: 12485 ft: 14972 corp: 14/228b lim: 30 exec/s: 39 rss: 73Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:16.916 [2024-11-17 08:19:30.042793] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004056 00:07:16.916 [2024-11-17 08:19:30.042911] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.916 [2024-11-17 08:19:30.043011] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:16.916 [2024-11-17 08:19:30.043200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:30.043227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.916 [2024-11-17 08:19:30.043280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:30.043295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.916 [2024-11-17 08:19:30.043347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.916 [2024-11-17 08:19:30.043361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.175 #45 NEW cov: 12485 ft: 15034 corp: 15/250b lim: 30 exec/s: 45 rss: 73Mb L: 22/25 MS: 1 ShuffleBytes- 00:07:17.175 [2024-11-17 08:19:30.102923] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005656 00:07:17.175 [2024-11-17 08:19:30.103033] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (612700) > buf size (4096) 00:07:17.175 [2024-11-17 08:19:30.103135] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.175 [2024-11-17 08:19:30.103336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a268156 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.175 [2024-11-17 08:19:30.103362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.175 [2024-11-17 08:19:30.103418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.175 [2024-11-17 08:19:30.103433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.175 [2024-11-17 08:19:30.103484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560276 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.175 [2024-11-17 08:19:30.103499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.176 #46 NEW cov: 12485 ft: 15141 corp: 16/268b lim: 30 exec/s: 46 rss: 73Mb L: 18/25 MS: 1 ChangeBinInt- 00:07:17.176 [2024-11-17 08:19:30.142993] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.176 [2024-11-17 08:19:30.143104] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5656 00:07:17.176 [2024-11-17 08:19:30.143303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.143329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.176 [2024-11-17 08:19:30.143386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.143401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.176 #47 NEW cov: 12485 ft: 15167 corp: 17/283b lim: 30 exec/s: 47 rss: 73Mb L: 15/25 MS: 1 EraseBytes- 00:07:17.176 [2024-11-17 08:19:30.203200] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.176 [2024-11-17 08:19:30.203314] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.176 [2024-11-17 08:19:30.203418] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a56 00:07:17.176 [2024-11-17 08:19:30.203521] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000560a 00:07:17.176 [2024-11-17 08:19:30.203737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.203763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.176 [2024-11-17 08:19:30.203818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.203832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.176 [2024-11-17 08:19:30.203883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3b560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.203897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.176 [2024-11-17 08:19:30.203953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.203967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.176 #48 NEW cov: 12485 ft: 15216 corp: 18/307b lim: 30 exec/s: 48 rss: 73Mb L: 24/25 MS: 1 CopyPart- 00:07:17.176 [2024-11-17 08:19:30.263355] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100005656 00:07:17.176 [2024-11-17 08:19:30.263469] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.176 [2024-11-17 08:19:30.263571] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.176 [2024-11-17 08:19:30.263778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a268156 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.263803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.176 [2024-11-17 08:19:30.263857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.263872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.176 [2024-11-17 08:19:30.263923] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:78560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.176 [2024-11-17 08:19:30.263937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.176 #49 NEW cov: 12485 ft: 15237 corp: 19/326b lim: 30 exec/s: 49 rss: 73Mb L: 19/25 MS: 1 InsertByte- 00:07:17.436 [2024-11-17 08:19:30.323474] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.323584] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.323784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.323808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.323863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:30560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.323877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.436 #50 NEW cov: 12485 ft: 15274 corp: 20/339b lim: 30 exec/s: 50 rss: 73Mb L: 13/25 MS: 1 ChangeByte- 00:07:17.436 [2024-11-17 08:19:30.363654] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.363784] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.363890] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (612700) > buf size (4096) 00:07:17.436 [2024-11-17 08:19:30.363989] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.364183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.364209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.364261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.364276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.364330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.364344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.364396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.364410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.436 #51 
NEW cov: 12485 ft: 15284 corp: 21/364b lim: 30 exec/s: 51 rss: 73Mb L: 25/25 MS: 1 CrossOver- 00:07:17.436 [2024-11-17 08:19:30.403706] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c5c5 00:07:17.436 [2024-11-17 08:19:30.403815] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000a0a 00:07:17.436 [2024-11-17 08:19:30.404015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c5c581c5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.404041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.404095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c5c581c5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.404109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.436 #52 NEW cov: 12485 ft: 15308 corp: 22/376b lim: 30 exec/s: 52 rss: 73Mb L: 12/25 MS: 1 ShuffleBytes- 00:07:17.436 [2024-11-17 08:19:30.443787] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (202520) > buf size (4096) 00:07:17.436 [2024-11-17 08:19:30.443899] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (3288) > len (4) 00:07:17.436 [2024-11-17 08:19:30.444093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:c5c500c5 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.444119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.444172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.444187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.436 #53 NEW cov: 12498 ft: 15347 corp: 23/389b lim: 30 exec/s: 53 rss: 73Mb L: 13/25 MS: 1 InsertByte- 00:07:17.436 [2024-11-17 08:19:30.503991] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.504121] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.504228] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.504432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0a0226 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.504457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.504509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.504523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.504574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.504588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.436 #58 NEW cov: 12498 ft: 15362 corp: 24/408b lim: 30 exec/s: 58 rss: 73Mb L: 19/25 MS: 5 ShuffleBytes-CopyPart-ChangeByte-CopyPart-CrossOver- 00:07:17.436 [2024-11-17 08:19:30.544103] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004056 00:07:17.436 [2024-11-17 08:19:30.544220] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.544327] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.436 [2024-11-17 08:19:30.544534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.544559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.544613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.544628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.436 [2024-11-17 08:19:30.544680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.436 [2024-11-17 08:19:30.544710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.436 #59 NEW cov: 12498 ft: 15381 corp: 25/430b lim: 30 exec/s: 59 rss: 73Mb L: 22/25 MS: 1 ChangeBit- 00:07:17.697 [2024-11-17 08:19:30.584162] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10588) > buf size (4096) 00:07:17.697 [2024-11-17 08:19:30.584361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a560056 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.584385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.697 #63 NEW cov: 12498 ft: 15718 corp: 26/441b lim: 30 exec/s: 63 rss: 73Mb L: 11/25 MS: 4 InsertByte-ChangeByte-CopyPart-CrossOver- 00:07:17.697 [2024-11-17 08:19:30.624295] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.697 [2024-11-17 08:19:30.624408] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x56f5 00:07:17.697 [2024-11-17 08:19:30.624614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.624639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.697 [2024-11-17 08:19:30.624691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.624710] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.697 #64 NEW cov: 12498 ft: 15731 corp: 27/456b lim: 30 exec/s: 64 rss: 73Mb L: 15/25 MS: 1 CMP- DE: "\365\377\377\377"- 00:07:17.697 [2024-11-17 08:19:30.684525] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004056 00:07:17.697 [2024-11-17 08:19:30.684641] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c5c5 00:07:17.697 [2024-11-17 08:19:30.684945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.684971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.697 [2024-11-17 08:19:30.685027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56568156 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.685045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.697 [2024-11-17 08:19:30.685098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.685112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.697 #65 NEW cov: 12508 ft: 15748 corp: 28/478b lim: 30 exec/s: 65 rss: 73Mb L: 22/25 MS: 1 CrossOver- 00:07:17.697 [2024-11-17 08:19:30.724620] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:07:17.697 [2024-11-17 08:19:30.724740] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (88412) > buf size (4096) 00:07:17.697 [2024-11-17 08:19:30.724844] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5656 00:07:17.697 [2024-11-17 08:19:30.725058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a260256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.725085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.697 [2024-11-17 08:19:30.725140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560056 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.725155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.697 [2024-11-17 08:19:30.725210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00560056 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.697 [2024-11-17 08:19:30.725224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.697 #66 NEW cov: 12508 ft: 15768 corp: 29/500b lim: 30 exec/s: 33 rss: 73Mb L: 22/25 MS: 1 CrossOver- 00:07:17.697 #66 DONE cov: 12508 ft: 15768 corp: 29/500b lim: 30 exec/s: 33 rss: 73Mb 00:07:17.697 ###### Recommended dictionary. ###### 00:07:17.697 "\365\377\377\377" # Uses: 0 00:07:17.697 ###### End of recommended dictionary. 
###### 00:07:17.697 Done 66 runs in 2 second(s) 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:17.957 08:19:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:17.957 [2024-11-17 08:19:30.883362] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:17.957 [2024-11-17 08:19:30.883414] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid990433 ] 00:07:17.957 [2024-11-17 08:19:31.061196] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.957 [2024-11-17 08:19:31.082992] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.217 [2024-11-17 08:19:31.135549] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:18.217 [2024-11-17 08:19:31.151881] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:18.217 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:18.217 INFO: Seed: 965509803 00:07:18.217 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:18.217 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:18.217 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:18.217 INFO: A corpus is not provided, starting from an empty corpus 00:07:18.217 #2 INITED exec/s: 0 rss: 65Mb 00:07:18.217 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:18.217 This may also happen if the target rejected all inputs we tried so far 00:07:18.217 [2024-11-17 08:19:31.196629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0500000a cdw11:21000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.217 [2024-11-17 08:19:31.196662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.476 NEW_FUNC[1/714]: 0x45c9f8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:18.476 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:18.476 #10 NEW cov: 12179 ft: 12180 corp: 2/8b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 3 CopyPart-InsertByte-CMP- DE: "\005\000\000\000"- 00:07:18.476 [2024-11-17 08:19:31.537526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6025000a cdw11:00001600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.476 [2024-11-17 08:19:31.537565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.476 #18 NEW cov: 12304 ft: 12810 corp: 3/15b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 3 InsertByte-CMP-InsertByte- DE: "\026\000\000\000"- 00:07:18.477 [2024-11-17 08:19:31.597523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fdff000a cdw11:21000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.477 [2024-11-17 08:19:31.597554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.735 #19 NEW cov: 12310 ft: 13173 corp: 4/22b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 ChangeBinInt- 00:07:18.735 [2024-11-17 08:19:31.687679] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:18.735 [2024-11-17 08:19:31.687814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.735 [2024-11-17 08:19:31.687839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.735 #22 NEW cov: 12406 ft: 13498 corp: 5/34b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:18.735 [2024-11-17 08:19:31.737813] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:18.735 [2024-11-17 08:19:31.737940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.735 [2024-11-17 08:19:31.737964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.735 #23 NEW cov: 12406 ft: 13634 corp: 6/46b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 CopyPart- 00:07:18.735 [2024-11-17 08:19:31.828162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.735 [2024-11-17 08:19:31.828192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.735 [2024-11-17 08:19:31.828224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.735 [2024-11-17 08:19:31.828239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.735 #28 NEW cov: 12406 ft: 13978 corp: 7/64b lim: 35 exec/s: 0 rss: 72Mb L: 18/18 MS: 5 CopyPart-ChangeByte-InsertByte-CopyPart-InsertRepeatedBytes- 00:07:18.994 [2024-11-17 08:19:31.878417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.994 [2024-11-17 08:19:31.878447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.994 [2024-11-17 08:19:31.878478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.994 [2024-11-17 08:19:31.878493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.994 [2024-11-17 08:19:31.878521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.994 [2024-11-17 08:19:31.878536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.994 [2024-11-17 08:19:31.878563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.994 [2024-11-17 08:19:31.878579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.994 #29 NEW cov: 12406 ft: 14584 corp: 8/97b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:18.994 [2024-11-17 08:19:31.938412] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:18.994 [2024-11-17 08:19:31.938529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fdff000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.994 [2024-11-17 08:19:31.938551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.994 [2024-11-17 08:19:31.938582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.994 [2024-11-17 08:19:31.938598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.994 #30 NEW cov: 12406 ft: 
14678 corp: 9/116b lim: 35 exec/s: 0 rss: 72Mb L: 19/33 MS: 1 InsertRepeatedBytes- 00:07:18.994 [2024-11-17 08:19:32.009311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.009342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.995 [2024-11-17 08:19:32.009398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.009412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.995 #31 NEW cov: 12406 ft: 14790 corp: 10/136b lim: 35 exec/s: 0 rss: 72Mb L: 20/33 MS: 1 EraseBytes- 00:07:18.995 [2024-11-17 08:19:32.069732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.069759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.995 [2024-11-17 08:19:32.069816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.069830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.995 [2024-11-17 08:19:32.069883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.069897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.995 [2024-11-17 08:19:32.069951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.069965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.995 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:18.995 #32 NEW cov: 12423 ft: 14858 corp: 11/166b lim: 35 exec/s: 0 rss: 72Mb L: 30/33 MS: 1 EraseBytes- 00:07:18.995 [2024-11-17 08:19:32.109454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0500000a cdw11:21000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.995 [2024-11-17 08:19:32.109480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.995 #33 NEW cov: 12423 ft: 14931 corp: 12/173b lim: 35 exec/s: 0 rss: 72Mb L: 7/33 MS: 1 ChangeBit- 00:07:19.254 [2024-11-17 08:19:32.149590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6025000a cdw11:00001600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.254 [2024-11-17 08:19:32.149615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.254 #39 NEW cov: 12423 ft: 14953 corp: 13/180b lim: 35 exec/s: 39 
rss: 72Mb L: 7/33 MS: 1 ChangeByte- 00:07:19.254 [2024-11-17 08:19:32.209558] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:19.254 [2024-11-17 08:19:32.209782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.254 [2024-11-17 08:19:32.209808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.254 #40 NEW cov: 12423 ft: 14999 corp: 14/192b lim: 35 exec/s: 40 rss: 72Mb L: 12/33 MS: 1 ChangeByte- 00:07:19.254 [2024-11-17 08:19:32.249953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.254 [2024-11-17 08:19:32.249979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.254 [2024-11-17 08:19:32.250034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.254 [2024-11-17 08:19:32.250052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.254 #41 NEW cov: 12423 ft: 15008 corp: 15/210b lim: 35 exec/s: 41 rss: 72Mb L: 18/33 MS: 1 ShuffleBytes- 00:07:19.255 [2024-11-17 08:19:32.310247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.255 [2024-11-17 08:19:32.310274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.255 [2024-11-17 08:19:32.310328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.255 [2024-11-17 08:19:32.310343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.255 [2024-11-17 08:19:32.310396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:7fd2007f cdw11:00001100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.255 [2024-11-17 08:19:32.310409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.255 #42 NEW cov: 12423 ft: 15198 corp: 16/234b lim: 35 exec/s: 42 rss: 72Mb L: 24/33 MS: 1 InsertRepeatedBytes- 00:07:19.255 [2024-11-17 08:19:32.369992] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:19.255 [2024-11-17 08:19:32.370209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.255 [2024-11-17 08:19:32.370234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.515 #43 NEW cov: 12423 ft: 15224 corp: 17/246b lim: 35 exec/s: 43 rss: 72Mb L: 12/33 MS: 1 PersAutoDict- DE: "\005\000\000\000"- 00:07:19.515 [2024-11-17 08:19:32.430537] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:19.515 [2024-11-17 08:19:32.430767] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.430791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.430849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.430863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.430916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000007f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.430930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.430981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.430997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.515 #44 NEW cov: 12423 ft: 15235 corp: 18/276b lim: 35 exec/s: 44 rss: 72Mb L: 30/33 MS: 1 InsertRepeatedBytes- 00:07:19.515 [2024-11-17 08:19:32.470833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.470859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.470911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.470928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.470978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.470992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.471043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.471057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.515 #45 NEW cov: 12423 ft: 15255 corp: 19/306b lim: 35 exec/s: 45 rss: 72Mb L: 30/33 MS: 1 ShuffleBytes- 00:07:19.515 [2024-11-17 08:19:32.531001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.531028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.531081] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.531095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.531147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0012 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.531161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.531210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.531224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.515 #46 NEW cov: 12423 ft: 15259 corp: 20/336b lim: 35 exec/s: 46 rss: 72Mb L: 30/33 MS: 1 ChangeByte- 00:07:19.515 [2024-11-17 08:19:32.590701] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:19.515 [2024-11-17 08:19:32.591226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.591255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.591310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.591325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.591378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.591391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.515 [2024-11-17 08:19:32.591445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.591459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.515 #47 NEW cov: 12423 ft: 15282 corp: 21/364b lim: 35 exec/s: 47 rss: 72Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:07:19.515 [2024-11-17 08:19:32.630875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0500000a cdw11:21000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.515 [2024-11-17 08:19:32.630900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.515 #48 NEW cov: 12423 ft: 15316 corp: 22/372b lim: 35 exec/s: 48 rss: 72Mb L: 8/33 MS: 1 InsertByte- 00:07:19.775 [2024-11-17 08:19:32.671178] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:19.775 [2024-11-17 08:19:32.671414] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.671440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.775 [2024-11-17 08:19:32.671494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.671509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.775 [2024-11-17 08:19:32.671562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000007f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.671576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.775 [2024-11-17 08:19:32.671632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.671648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.775 #49 NEW cov: 12423 ft: 15337 corp: 23/403b lim: 35 exec/s: 49 rss: 73Mb L: 31/33 MS: 1 CopyPart- 00:07:19.775 [2024-11-17 08:19:32.731176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0500000a cdw11:21000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.731202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.775 #50 NEW cov: 12423 ft: 15349 corp: 24/410b lim: 35 exec/s: 50 rss: 73Mb L: 7/33 MS: 1 ShuffleBytes- 00:07:19.775 [2024-11-17 08:19:32.771274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000d8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.771299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.775 #51 NEW cov: 12423 ft: 15390 corp: 25/423b lim: 35 exec/s: 51 rss: 73Mb L: 13/33 MS: 1 InsertByte- 00:07:19.775 [2024-11-17 08:19:32.811386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000d8 cdw11:7e000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.811412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.775 #52 NEW cov: 12423 ft: 15426 corp: 26/436b lim: 35 exec/s: 52 rss: 73Mb L: 13/33 MS: 1 ChangeByte- 00:07:19.775 [2024-11-17 08:19:32.871637] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:19.775 [2024-11-17 08:19:32.871859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.871884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.775 [2024-11-17 08:19:32.871941] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.871955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.775 [2024-11-17 08:19:32.872018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.775 [2024-11-17 08:19:32.872034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.035 #53 NEW cov: 12423 ft: 15437 corp: 27/460b lim: 35 exec/s: 53 rss: 73Mb L: 24/33 MS: 1 ChangeBinInt- 00:07:20.035 [2024-11-17 08:19:32.932100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:32.932125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:32.932178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:32.932193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:32.932246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:32.932260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:32.932312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:32.932325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.035 #54 NEW cov: 12423 ft: 15466 corp: 28/490b lim: 35 exec/s: 54 rss: 73Mb L: 30/33 MS: 1 CopyPart- 00:07:20.035 [2024-11-17 08:19:32.991841] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:20.035 [2024-11-17 08:19:32.992056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d800003f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:32.992081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:32.992137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:32.992154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.035 #55 NEW cov: 12423 ft: 15471 corp: 29/504b lim: 35 exec/s: 55 rss: 73Mb L: 14/33 MS: 1 InsertByte- 00:07:20.035 [2024-11-17 08:19:33.052426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.052452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:33.052505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.052518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:33.052570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ff1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.052583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:33.052635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.052651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.035 #56 NEW cov: 12423 ft: 15475 corp: 30/534b lim: 35 exec/s: 56 rss: 73Mb L: 30/33 MS: 1 ChangeBinInt- 00:07:20.035 [2024-11-17 08:19:33.092086] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:20.035 [2024-11-17 08:19:33.092306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:aa0500b1 cdw11:00000021 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.092331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.035 [2024-11-17 08:19:33.092387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:7f000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.092403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.035 #60 NEW cov: 12430 ft: 15492 corp: 31/549b lim: 35 exec/s: 60 rss: 73Mb L: 15/33 MS: 4 EraseBytes-ChangeByte-InsertByte-CrossOver- 00:07:20.035 [2024-11-17 08:19:33.152371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0500000a cdw11:2100001d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:20.035 [2024-11-17 08:19:33.152397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.295 #61 NEW cov: 12430 ft: 15513 corp: 32/557b lim: 35 exec/s: 30 rss: 73Mb L: 8/33 MS: 1 ChangeByte- 00:07:20.295 #61 DONE cov: 12430 ft: 15513 corp: 32/557b lim: 35 exec/s: 30 rss: 73Mb 00:07:20.295 ###### Recommended dictionary. ###### 00:07:20.295 "\005\000\000\000" # Uses: 1 00:07:20.295 "\026\000\000\000" # Uses: 0 00:07:20.295 ###### End of recommended dictionary. 
###### 00:07:20.295 Done 61 runs in 2 second(s) 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:20.295 08:19:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:20.295 [2024-11-17 08:19:33.350546] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:20.295 [2024-11-17 08:19:33.350614] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid990776 ] 00:07:20.554 [2024-11-17 08:19:33.523795] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.554 [2024-11-17 08:19:33.545022] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.554 [2024-11-17 08:19:33.597437] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:20.554 [2024-11-17 08:19:33.613766] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:20.554 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:20.554 INFO: Seed: 3427536630 00:07:20.554 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:20.554 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:20.554 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:20.554 INFO: A corpus is not provided, starting from an empty corpus 00:07:20.554 #2 INITED exec/s: 0 rss: 65Mb 00:07:20.554 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:20.554 This may also happen if the target rejected all inputs we tried so far 00:07:21.072 NEW_FUNC[1/702]: 0x45e6d8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:21.072 NEW_FUNC[2/702]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:21.072 #15 NEW cov: 12081 ft: 12082 corp: 2/5b lim: 20 exec/s: 0 rss: 72Mb L: 4/4 MS: 3 InsertByte-InsertByte-InsertByte- 00:07:21.072 NEW_FUNC[1/1]: 0x17bdf98 in nvme_ctrlr_get_ready_timeout /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:1288 00:07:21.072 #17 NEW cov: 12199 ft: 12734 corp: 3/9b lim: 20 exec/s: 0 rss: 72Mb L: 4/4 MS: 2 CrossOver-CopyPart- 00:07:21.072 #19 NEW cov: 12205 ft: 12923 corp: 4/14b lim: 20 exec/s: 0 rss: 72Mb L: 5/5 MS: 2 ChangeBit-CrossOver- 00:07:21.072 #24 NEW cov: 12304 ft: 13567 corp: 5/22b lim: 20 exec/s: 0 rss: 72Mb L: 8/8 MS: 5 EraseBytes-EraseBytes-ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:21.331 #25 NEW cov: 12304 ft: 13689 corp: 6/26b lim: 20 exec/s: 0 rss: 72Mb L: 4/8 MS: 1 ChangeByte- 00:07:21.331 #28 NEW cov: 12304 ft: 13765 corp: 7/35b lim: 20 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:07:21.331 #29 NEW cov: 12321 ft: 14248 corp: 8/53b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:07:21.331 #30 NEW cov: 12321 ft: 14317 corp: 9/64b lim: 20 exec/s: 0 rss: 72Mb L: 11/18 MS: 1 InsertRepeatedBytes- 00:07:21.591 #31 NEW cov: 12321 ft: 14380 corp: 10/75b lim: 20 exec/s: 0 rss: 73Mb L: 11/18 MS: 1 ChangeBit- 00:07:21.591 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:21.591 #32 NEW cov: 12344 ft: 14437 corp: 11/93b lim: 20 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 ChangeBinInt- 00:07:21.591 #33 NEW cov: 12344 ft: 14468 corp: 12/101b lim: 20 exec/s: 0 rss: 73Mb L: 8/18 MS: 1 CrossOver- 00:07:21.591 #34 NEW cov: 12344 ft: 14533 corp: 13/109b lim: 20 exec/s: 34 rss: 73Mb L: 8/18 MS: 1 ChangeBit- 00:07:21.850 #35 NEW cov: 12344 ft: 14551 corp: 14/114b lim: 20 exec/s: 35 rss: 73Mb L: 5/18 MS: 1 ChangeBit- 00:07:21.850 #36 NEW cov: 12344 ft: 14564 corp: 15/122b lim: 20 exec/s: 36 rss: 73Mb L: 8/18 MS: 1 ChangeBit- 00:07:21.850 #37 NEW cov: 12348 ft: 14734 corp: 16/135b lim: 20 exec/s: 37 rss: 73Mb L: 13/18 MS: 1 EraseBytes- 00:07:21.850 #38 NEW cov: 12348 ft: 14829 corp: 17/139b lim: 20 exec/s: 38 rss: 73Mb L: 4/18 MS: 1 ChangeBit- 00:07:21.851 #39 NEW cov: 12348 ft: 14875 corp: 18/144b lim: 20 exec/s: 39 rss: 73Mb L: 5/18 MS: 1 ChangeByte- 00:07:22.110 #40 NEW cov: 12348 ft: 14889 corp: 19/149b lim: 20 exec/s: 40 rss: 73Mb L: 5/18 MS: 1 InsertByte- 00:07:22.110 #41 NEW cov: 12348 ft: 14897 corp: 20/162b lim: 20 exec/s: 41 rss: 73Mb L: 13/18 MS: 1 InsertRepeatedBytes- 00:07:22.110 #42 NEW cov: 12348 ft: 14944 corp: 21/166b lim: 20 exec/s: 42 rss: 73Mb L: 4/18 MS: 
1 ChangeBinInt- 00:07:22.110 #43 NEW cov: 12348 ft: 14980 corp: 22/171b lim: 20 exec/s: 43 rss: 73Mb L: 5/18 MS: 1 ShuffleBytes- 00:07:22.370 #44 NEW cov: 12348 ft: 15013 corp: 23/189b lim: 20 exec/s: 44 rss: 73Mb L: 18/18 MS: 1 ChangeByte- 00:07:22.370 #45 NEW cov: 12348 ft: 15048 corp: 24/196b lim: 20 exec/s: 45 rss: 74Mb L: 7/18 MS: 1 CopyPart- 00:07:22.370 [2024-11-17 08:19:35.395435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:22.370 [2024-11-17 08:19:35.395478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.370 NEW_FUNC[1/19]: 0x134bdb8 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3477 00:07:22.370 NEW_FUNC[2/19]: 0x134c938 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3419 00:07:22.370 #46 NEW cov: 12649 ft: 15359 corp: 25/211b lim: 20 exec/s: 46 rss: 74Mb L: 15/18 MS: 1 InsertRepeatedBytes- 00:07:22.370 #47 NEW cov: 12649 ft: 15367 corp: 26/215b lim: 20 exec/s: 47 rss: 74Mb L: 4/18 MS: 1 CrossOver- 00:07:22.630 #48 NEW cov: 12649 ft: 15396 corp: 27/219b lim: 20 exec/s: 48 rss: 74Mb L: 4/18 MS: 1 CopyPart- 00:07:22.630 #49 NEW cov: 12649 ft: 15415 corp: 28/232b lim: 20 exec/s: 49 rss: 74Mb L: 13/18 MS: 1 ShuffleBytes- 00:07:22.630 #50 NEW cov: 12649 ft: 15429 corp: 29/241b lim: 20 exec/s: 25 rss: 74Mb L: 9/18 MS: 1 CrossOver- 00:07:22.630 #50 DONE cov: 12649 ft: 15429 corp: 29/241b lim: 20 exec/s: 25 rss: 74Mb 00:07:22.630 Done 50 runs in 2 second(s) 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:22.890 08:19:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:22.890 [2024-11-17 08:19:35.821852] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:22.891 [2024-11-17 08:19:35.821919] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid991308 ] 00:07:22.891 [2024-11-17 08:19:35.999723] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.891 [2024-11-17 08:19:36.021381] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.150 [2024-11-17 08:19:36.073743] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:23.150 [2024-11-17 08:19:36.090028] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:23.150 INFO: Running with entropic power schedule (0xFF, 100). 00:07:23.150 INFO: Seed: 1607557700 00:07:23.150 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:23.150 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:23.150 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:23.150 INFO: A corpus is not provided, starting from an empty corpus 00:07:23.150 #2 INITED exec/s: 0 rss: 66Mb 00:07:23.150 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:23.150 This may also happen if the target rejected all inputs we tried so far 00:07:23.150 [2024-11-17 08:19:36.139409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.150 [2024-11-17 08:19:36.139437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.150 [2024-11-17 08:19:36.139490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.150 [2024-11-17 08:19:36.139504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.412 NEW_FUNC[1/715]: 0x45f7d8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:23.412 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:23.412 #9 NEW cov: 12212 ft: 12211 corp: 2/18b lim: 35 exec/s: 0 rss: 73Mb L: 17/17 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:23.412 [2024-11-17 08:19:36.460246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.412 [2024-11-17 08:19:36.460277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.412 [2024-11-17 08:19:36.460329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.412 [2024-11-17 08:19:36.460342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.412 #10 NEW cov: 12325 ft: 12845 corp: 3/38b lim: 35 exec/s: 0 rss: 73Mb L: 20/20 MS: 1 CopyPart- 00:07:23.412 [2024-11-17 08:19:36.520345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:faff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.412 [2024-11-17 08:19:36.520371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.412 [2024-11-17 08:19:36.520423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.412 [2024-11-17 08:19:36.520437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.412 #11 NEW cov: 12331 ft: 13089 corp: 4/55b lim: 35 exec/s: 0 rss: 73Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:23.672 [2024-11-17 08:19:36.560411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.672 [2024-11-17 08:19:36.560439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.672 [2024-11-17 08:19:36.560493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cbffffff cdw11:ffff0003 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:23.672 [2024-11-17 08:19:36.560507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.672 #17 NEW cov: 12416 ft: 13363 corp: 5/73b lim: 35 exec/s: 0 rss: 73Mb L: 18/20 MS: 1 InsertByte- 00:07:23.673 [2024-11-17 08:19:36.600621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:60ff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.600647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.673 [2024-11-17 08:19:36.600707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cbffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.600721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.673 #18 NEW cov: 12416 ft: 13495 corp: 6/91b lim: 35 exec/s: 0 rss: 73Mb L: 18/20 MS: 1 ChangeByte- 00:07:23.673 [2024-11-17 08:19:36.661101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.661127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.673 [2024-11-17 08:19:36.661180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.661194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.673 [2024-11-17 08:19:36.661244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.661258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.673 [2024-11-17 08:19:36.661309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.661321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.673 #19 NEW cov: 12416 ft: 13907 corp: 7/122b lim: 35 exec/s: 0 rss: 73Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:23.673 [2024-11-17 08:19:36.700870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:05000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.700896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.673 [2024-11-17 08:19:36.700949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.700963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.673 #20 NEW cov: 12416 ft: 14005 corp: 8/140b lim: 35 exec/s: 
0 rss: 73Mb L: 18/31 MS: 1 CMP- DE: "\005\000\000\000\000\000\000\000"- 00:07:23.673 [2024-11-17 08:19:36.761046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:fffa0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.761072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.673 [2024-11-17 08:19:36.761125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.761142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.673 #24 NEW cov: 12416 ft: 14040 corp: 9/156b lim: 35 exec/s: 0 rss: 73Mb L: 16/31 MS: 4 CopyPart-CopyPart-CopyPart-CrossOver- 00:07:23.673 [2024-11-17 08:19:36.801010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.673 [2024-11-17 08:19:36.801036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.933 #26 NEW cov: 12416 ft: 14767 corp: 10/165b lim: 35 exec/s: 0 rss: 73Mb L: 9/31 MS: 2 CopyPart-PersAutoDict- DE: "\005\000\000\000\000\000\000\000"- 00:07:23.933 [2024-11-17 08:19:36.841279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:05000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.841304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.841356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.841369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.933 #27 NEW cov: 12416 ft: 14809 corp: 11/183b lim: 35 exec/s: 0 rss: 73Mb L: 18/31 MS: 1 PersAutoDict- DE: "\005\000\000\000\000\000\000\000"- 00:07:23.933 [2024-11-17 08:19:36.901611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.901638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.901691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.901710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.901764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.901777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.933 #29 NEW cov: 12416 ft: 15038 corp: 12/206b lim: 35 exec/s: 
0 rss: 73Mb L: 23/31 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:23.933 [2024-11-17 08:19:36.941531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.941555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.941608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:cbff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.941622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.933 #30 NEW cov: 12416 ft: 15143 corp: 13/224b lim: 35 exec/s: 0 rss: 73Mb L: 18/31 MS: 1 ShuffleBytes- 00:07:23.933 [2024-11-17 08:19:36.982005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.982031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.982083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.982100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.982148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.982162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:36.982213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:36.982225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.933 #31 NEW cov: 12416 ft: 15169 corp: 14/255b lim: 35 exec/s: 0 rss: 74Mb L: 31/31 MS: 1 ChangeByte- 00:07:23.933 [2024-11-17 08:19:37.041866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:05000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:37.041892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.933 [2024-11-17 08:19:37.041945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:fff90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.933 [2024-11-17 08:19:37.041959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.933 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:23.933 #37 NEW cov: 12439 ft: 15235 corp: 15/273b lim: 35 exec/s: 0 rss: 74Mb L: 18/31 MS: 1 ChangeBinInt- 00:07:24.193 [2024-11-17 08:19:37.082130] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:f0ef55a3 cdw11:7e960001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.082157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.082210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff05000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.082226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.082277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.082290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.193 #38 NEW cov: 12439 ft: 15245 corp: 16/299b lim: 35 exec/s: 0 rss: 74Mb L: 26/31 MS: 1 CMP- DE: "U\243\360\357~\226\212\000"- 00:07:24.193 [2024-11-17 08:19:37.122092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00050a00 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.122117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.122170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:fff90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.122185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.193 #39 NEW cov: 12439 ft: 15289 corp: 17/317b lim: 35 exec/s: 39 rss: 74Mb L: 18/31 MS: 1 ShuffleBytes- 00:07:24.193 [2024-11-17 08:19:37.182245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00050a00 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.182276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.182329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.182343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.193 #40 NEW cov: 12439 ft: 15295 corp: 18/335b lim: 35 exec/s: 40 rss: 74Mb L: 18/31 MS: 1 ShuffleBytes- 00:07:24.193 [2024-11-17 08:19:37.242420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:05000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.242446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.242500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.242514] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.193 #41 NEW cov: 12439 ft: 15368 corp: 19/353b lim: 35 exec/s: 41 rss: 74Mb L: 18/31 MS: 1 ChangeBinInt- 00:07:24.193 [2024-11-17 08:19:37.282883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.282908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.282961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.282975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.283025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.283039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.193 [2024-11-17 08:19:37.283092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.193 [2024-11-17 08:19:37.283105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.193 #42 NEW cov: 12439 ft: 15373 corp: 20/385b lim: 35 exec/s: 42 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:07:24.454 [2024-11-17 08:19:37.342758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.342783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.454 [2024-11-17 08:19:37.342838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffcbffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.342852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.454 #43 NEW cov: 12439 ft: 15385 corp: 21/404b lim: 35 exec/s: 43 rss: 74Mb L: 19/32 MS: 1 InsertByte- 00:07:24.454 [2024-11-17 08:19:37.382633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00050500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.382658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.454 #44 NEW cov: 12439 ft: 15424 corp: 22/413b lim: 35 exec/s: 44 rss: 74Mb L: 9/32 MS: 1 CopyPart- 00:07:24.454 [2024-11-17 08:19:37.442989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2dff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.443016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.454 [2024-11-17 
08:19:37.443068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.443082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.454 #47 NEW cov: 12439 ft: 15434 corp: 23/427b lim: 35 exec/s: 47 rss: 74Mb L: 14/32 MS: 3 CopyPart-InsertByte-CrossOver- 00:07:24.454 [2024-11-17 08:19:37.483386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:05000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.483412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.454 [2024-11-17 08:19:37.483468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00620000 cdw11:62620002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.483482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.454 [2024-11-17 08:19:37.483534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:62626262 cdw11:62620002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.483548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.454 [2024-11-17 08:19:37.483601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff62ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.483613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.454 #48 NEW cov: 12439 ft: 15459 corp: 24/457b lim: 35 exec/s: 48 rss: 74Mb L: 30/32 MS: 1 InsertRepeatedBytes- 00:07:24.454 [2024-11-17 08:19:37.543239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.543264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.454 [2024-11-17 08:19:37.543317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000ff01 cdw11:33ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.454 [2024-11-17 08:19:37.543330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.454 #49 NEW cov: 12439 ft: 15526 corp: 25/475b lim: 35 exec/s: 49 rss: 74Mb L: 18/32 MS: 1 ChangeBinInt- 00:07:24.714 [2024-11-17 08:19:37.603282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:78080513 cdw11:3d7f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.714 [2024-11-17 08:19:37.603307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.714 #50 NEW cov: 12439 ft: 15589 corp: 26/484b lim: 35 exec/s: 50 rss: 74Mb L: 9/32 MS: 1 CMP- DE: "\023x\010=\177\226\212\000"- 00:07:24.714 [2024-11-17 08:19:37.643697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ 
(05) qid:0 cid:4 nsid:0 cdw10:f00055a3 cdw11:00960001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.714 [2024-11-17 08:19:37.643722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.714 [2024-11-17 08:19:37.643775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff05000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.714 [2024-11-17 08:19:37.643791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.714 [2024-11-17 08:19:37.643843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.714 [2024-11-17 08:19:37.643857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.714 #51 NEW cov: 12439 ft: 15596 corp: 27/510b lim: 35 exec/s: 51 rss: 74Mb L: 26/32 MS: 1 CopyPart- 00:07:24.714 [2024-11-17 08:19:37.703900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00050a00 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.714 [2024-11-17 08:19:37.703925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.715 [2024-11-17 08:19:37.703979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:fff90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.703992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.715 [2024-11-17 08:19:37.704044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.704057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.715 #52 NEW cov: 12439 ft: 15617 corp: 28/535b lim: 35 exec/s: 52 rss: 75Mb L: 25/32 MS: 1 CopyPart- 00:07:24.715 [2024-11-17 08:19:37.763725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01050500 cdw11:28000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.763750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.715 #56 NEW cov: 12439 ft: 15623 corp: 29/542b lim: 35 exec/s: 56 rss: 75Mb L: 7/32 MS: 4 EraseBytes-ShuffleBytes-ChangeBinInt-InsertByte- 00:07:24.715 [2024-11-17 08:19:37.804139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:f0ef55a3 cdw11:7e960001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.804163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.715 [2024-11-17 08:19:37.804217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.804231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.715 [2024-11-17 08:19:37.804283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff000000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.804297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.715 #57 NEW cov: 12439 ft: 15646 corp: 30/568b lim: 35 exec/s: 57 rss: 75Mb L: 26/32 MS: 1 ShuffleBytes- 00:07:24.715 [2024-11-17 08:19:37.844116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.844141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.715 [2024-11-17 08:19:37.844192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.715 [2024-11-17 08:19:37.844205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.975 #58 NEW cov: 12439 ft: 15658 corp: 31/587b lim: 35 exec/s: 58 rss: 75Mb L: 19/32 MS: 1 EraseBytes- 00:07:24.975 [2024-11-17 08:19:37.904268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:15ff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.904293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:37.904345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:cbff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.904359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.975 #59 NEW cov: 12439 ft: 15659 corp: 32/605b lim: 35 exec/s: 59 rss: 75Mb L: 18/32 MS: 1 ChangeByte- 00:07:24.975 [2024-11-17 08:19:37.944642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.944668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:37.944730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.944744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:37.944796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.944810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:37.944871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 
[2024-11-17 08:19:37.944884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.975 #60 NEW cov: 12439 ft: 15662 corp: 33/639b lim: 35 exec/s: 60 rss: 75Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:24.975 [2024-11-17 08:19:37.984630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.984655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:37.984709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.984724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:37.984774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:37.984788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.975 #61 NEW cov: 12439 ft: 15668 corp: 34/662b lim: 35 exec/s: 61 rss: 75Mb L: 23/34 MS: 1 ShuffleBytes- 00:07:24.975 [2024-11-17 08:19:38.024558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:05000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:38.024583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:38.024634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ff050003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:38.024648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.975 #62 NEW cov: 12439 ft: 15700 corp: 35/680b lim: 35 exec/s: 62 rss: 75Mb L: 18/34 MS: 1 ChangeBinInt- 00:07:24.975 [2024-11-17 08:19:38.084756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:13780000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:38.084781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.975 [2024-11-17 08:19:38.084835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:968a3d7f cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.975 [2024-11-17 08:19:38.084848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.236 #63 NEW cov: 12439 ft: 15713 corp: 36/696b lim: 35 exec/s: 31 rss: 75Mb L: 16/34 MS: 1 PersAutoDict- DE: "\023x\010=\177\226\212\000"- 00:07:25.236 #63 DONE cov: 12439 ft: 15713 corp: 36/696b lim: 35 exec/s: 31 rss: 75Mb 00:07:25.236 ###### Recommended dictionary. 
###### 00:07:25.236 "\005\000\000\000\000\000\000\000" # Uses: 2 00:07:25.236 "U\243\360\357~\226\212\000" # Uses: 0 00:07:25.236 "\023x\010=\177\226\212\000" # Uses: 1 00:07:25.236 ###### End of recommended dictionary. ###### 00:07:25.236 Done 63 runs in 2 second(s) 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:25.236 08:19:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:25.236 [2024-11-17 08:19:38.263554] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:25.236 [2024-11-17 08:19:38.263616] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid991630 ] 00:07:25.496 [2024-11-17 08:19:38.442087] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.496 [2024-11-17 08:19:38.463822] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.496 [2024-11-17 08:19:38.516147] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:25.496 [2024-11-17 08:19:38.532477] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:25.496 INFO: Running with entropic power schedule (0xFF, 100). 00:07:25.496 INFO: Seed: 4051560172 00:07:25.496 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:25.496 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:25.496 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:25.496 INFO: A corpus is not provided, starting from an empty corpus 00:07:25.496 #2 INITED exec/s: 0 rss: 65Mb 00:07:25.496 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:25.496 This may also happen if the target rejected all inputs we tried so far 00:07:25.496 [2024-11-17 08:19:38.608896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.496 [2024-11-17 08:19:38.608932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.496 [2024-11-17 08:19:38.609036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.496 [2024-11-17 08:19:38.609054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.022 NEW_FUNC[1/715]: 0x461978 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:26.022 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:26.022 #3 NEW cov: 12223 ft: 12224 corp: 2/27b lim: 45 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:26.022 [2024-11-17 08:19:38.940281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.022 [2024-11-17 08:19:38.940317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.022 [2024-11-17 08:19:38.940440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.022 [2024-11-17 08:19:38.940457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.022 [2024-11-17 08:19:38.940584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 
cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.022 [2024-11-17 08:19:38.940604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.022 #4 NEW cov: 12336 ft: 13211 corp: 3/54b lim: 45 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 CrossOver- 00:07:26.022 [2024-11-17 08:19:38.991046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.022 [2024-11-17 08:19:38.991075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.022 [2024-11-17 08:19:38.991203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.022 [2024-11-17 08:19:38.991219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:38.991335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:38.991353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:38.991474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:38.991492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:38.991609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:38.991626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.023 #5 NEW cov: 12342 ft: 13858 corp: 4/99b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:26.023 [2024-11-17 08:19:39.061312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.061341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:39.061464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.061484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:39.061608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.061626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:39.061758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 
cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.061773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:39.061891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.061909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.023 #6 NEW cov: 12427 ft: 14095 corp: 5/144b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:26.023 [2024-11-17 08:19:39.130880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.130908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:39.131047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.131065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.023 [2024-11-17 08:19:39.131187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.023 [2024-11-17 08:19:39.131203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.023 #7 NEW cov: 12427 ft: 14231 corp: 6/172b lim: 45 exec/s: 0 rss: 72Mb L: 28/45 MS: 1 CrossOver- 00:07:26.292 [2024-11-17 08:19:39.181633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.292 [2024-11-17 08:19:39.181662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.292 [2024-11-17 08:19:39.181789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.292 [2024-11-17 08:19:39.181806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.292 [2024-11-17 08:19:39.181935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.292 [2024-11-17 08:19:39.181950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.292 [2024-11-17 08:19:39.182078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.292 [2024-11-17 08:19:39.182095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.292 [2024-11-17 08:19:39.182213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 
cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.292 [2024-11-17 08:19:39.182231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.293 #8 NEW cov: 12427 ft: 14304 corp: 7/217b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:26.293 [2024-11-17 08:19:39.251177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.293 [2024-11-17 08:19:39.251203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.293 [2024-11-17 08:19:39.251346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.293 [2024-11-17 08:19:39.251365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.293 [2024-11-17 08:19:39.251488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.293 [2024-11-17 08:19:39.251505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.293 #9 NEW cov: 12427 ft: 14400 corp: 8/251b lim: 45 exec/s: 0 rss: 72Mb L: 34/45 MS: 1 EraseBytes- 00:07:26.293 [2024-11-17 08:19:39.301949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.293 [2024-11-17 08:19:39.301976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.293 [2024-11-17 08:19:39.302101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.293 [2024-11-17 08:19:39.302116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.293 [2024-11-17 08:19:39.302239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.293 [2024-11-17 08:19:39.302254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.293 [2024-11-17 08:19:39.302377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.294 [2024-11-17 08:19:39.302393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.294 [2024-11-17 08:19:39.302518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.294 [2024-11-17 08:19:39.302534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.294 #10 NEW cov: 12427 ft: 14438 corp: 9/296b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeByte- 00:07:26.294 [2024-11-17 
08:19:39.350936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.294 [2024-11-17 08:19:39.350963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.294 #11 NEW cov: 12427 ft: 15182 corp: 10/313b lim: 45 exec/s: 0 rss: 72Mb L: 17/45 MS: 1 EraseBytes- 00:07:26.294 [2024-11-17 08:19:39.422101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.294 [2024-11-17 08:19:39.422127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.294 [2024-11-17 08:19:39.422256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.294 [2024-11-17 08:19:39.422274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.294 [2024-11-17 08:19:39.422397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.294 [2024-11-17 08:19:39.422414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.294 [2024-11-17 08:19:39.422534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.295 [2024-11-17 08:19:39.422549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.558 #14 NEW cov: 12427 ft: 15232 corp: 11/349b lim: 45 exec/s: 0 rss: 72Mb L: 36/45 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:26.558 [2024-11-17 08:19:39.471929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.471956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.472097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.472114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.472240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.472257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.558 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:26.558 #15 NEW cov: 12450 ft: 15350 corp: 12/378b lim: 45 exec/s: 0 rss: 73Mb L: 29/45 MS: 1 EraseBytes- 00:07:26.558 [2024-11-17 08:19:39.542768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.542796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.542917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.542937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.543064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.543082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.543206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:2affffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.543220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.543344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.543361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.558 #16 NEW cov: 12450 ft: 15363 corp: 13/423b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ChangeByte- 00:07:26.558 [2024-11-17 08:19:39.591713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.591742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.558 #17 NEW cov: 12450 ft: 15377 corp: 14/440b lim: 45 exec/s: 17 rss: 73Mb L: 17/45 MS: 1 CrossOver- 00:07:26.558 [2024-11-17 08:19:39.662768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.662793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.662919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.662936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.663057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.663072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.558 [2024-11-17 08:19:39.663194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.558 [2024-11-17 08:19:39.663210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.558 #18 NEW cov: 12450 ft: 15390 corp: 15/476b lim: 45 exec/s: 18 rss: 73Mb L: 36/45 MS: 1 ChangeBinInt- 00:07:26.817 [2024-11-17 08:19:39.712068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.817 [2024-11-17 08:19:39.712096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.817 #19 NEW cov: 12450 ft: 15493 corp: 16/493b lim: 45 exec/s: 19 rss: 73Mb L: 17/45 MS: 1 ChangeBinInt- 00:07:26.817 [2024-11-17 08:19:39.762574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.817 [2024-11-17 08:19:39.762603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.817 [2024-11-17 08:19:39.762734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:24ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.817 [2024-11-17 08:19:39.762751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.817 #20 NEW cov: 12450 ft: 15506 corp: 17/519b lim: 45 exec/s: 20 rss: 73Mb L: 26/45 MS: 1 ChangeByte- 00:07:26.817 [2024-11-17 08:19:39.832434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.817 [2024-11-17 08:19:39.832459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.817 #21 NEW cov: 12450 ft: 15526 corp: 18/535b lim: 45 exec/s: 21 rss: 73Mb L: 16/45 MS: 1 EraseBytes- 00:07:26.817 [2024-11-17 08:19:39.883864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.883891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.818 [2024-11-17 08:19:39.884012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.884028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.818 [2024-11-17 08:19:39.884148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.884163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.818 [2024-11-17 08:19:39.884285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fff90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.884301] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.818 [2024-11-17 08:19:39.884413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.884430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.818 #22 NEW cov: 12450 ft: 15546 corp: 19/580b lim: 45 exec/s: 22 rss: 73Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:26.818 [2024-11-17 08:19:39.933120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:76520003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.933145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.818 [2024-11-17 08:19:39.933264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00ff968a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.818 [2024-11-17 08:19:39.933279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.077 #23 NEW cov: 12450 ft: 15566 corp: 20/604b lim: 45 exec/s: 23 rss: 73Mb L: 24/45 MS: 1 CMP- DE: "vR}\244\205\226\212\000"- 00:07:27.077 [2024-11-17 08:19:40.004185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.004223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.004364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.004396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.004534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7652ffff cdw11:7da40004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.004559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.004687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.004715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.004851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.004872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.077 #24 NEW cov: 12450 ft: 15596 corp: 21/649b lim: 45 exec/s: 24 rss: 73Mb L: 45/45 MS: 1 PersAutoDict- DE: "vR}\244\205\226\212\000"- 00:07:27.077 [2024-11-17 08:19:40.054139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.054166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.054291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.054309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.054429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.054445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.054562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.054577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.077 #25 NEW cov: 12450 ft: 15607 corp: 22/685b lim: 45 exec/s: 25 rss: 73Mb L: 36/45 MS: 1 CopyPart- 00:07:27.077 [2024-11-17 08:19:40.104284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.104314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.104454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1ae90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.104472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.104593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.104610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.104731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.104748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.077 #26 NEW cov: 12450 ft: 15620 corp: 23/728b lim: 45 exec/s: 26 rss: 73Mb L: 43/45 MS: 1 InsertRepeatedBytes- 00:07:27.077 [2024-11-17 08:19:40.154737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.154765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.154888] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.154905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.155029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.155045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.155168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:2affffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.155184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.077 [2024-11-17 08:19:40.155297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.077 [2024-11-17 08:19:40.155314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.077 #27 NEW cov: 12450 ft: 15641 corp: 24/773b lim: 45 exec/s: 27 rss: 73Mb L: 45/45 MS: 1 CrossOver- 00:07:27.337 [2024-11-17 08:19:40.224729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.224757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.224881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.224898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.225024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.225040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.225165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.225182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.337 #28 NEW cov: 12450 ft: 15661 corp: 25/814b lim: 45 exec/s: 28 rss: 73Mb L: 41/45 MS: 1 EraseBytes- 00:07:27.337 [2024-11-17 08:19:40.294801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.294827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.294951] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.294970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.295095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.295112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.295224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.295242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.337 #29 NEW cov: 12450 ft: 15725 corp: 26/854b lim: 45 exec/s: 29 rss: 73Mb L: 40/45 MS: 1 EraseBytes- 00:07:27.337 [2024-11-17 08:19:40.344896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.344924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.345047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1ae90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.345064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.345169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.345186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.345307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.345324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.337 #30 NEW cov: 12450 ft: 15733 corp: 27/898b lim: 45 exec/s: 30 rss: 73Mb L: 44/45 MS: 1 CopyPart- 00:07:27.337 [2024-11-17 08:19:40.415278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9260a cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.415306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.415424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.415456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.415575] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e9ffe9e9 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.415592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.415708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.415726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.337 #31 NEW cov: 12450 ft: 15739 corp: 28/937b lim: 45 exec/s: 31 rss: 73Mb L: 39/45 MS: 1 InsertRepeatedBytes- 00:07:27.337 [2024-11-17 08:19:40.465741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.465768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.465891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.465907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.466025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:85967da4 cdw11:8a000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.466043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.466161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:2affffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.466178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.337 [2024-11-17 08:19:40.466303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.337 [2024-11-17 08:19:40.466320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:27.596 #32 NEW cov: 12450 ft: 15769 corp: 29/982b lim: 45 exec/s: 32 rss: 73Mb L: 45/45 MS: 1 PersAutoDict- DE: "vR}\244\205\226\212\000"- 00:07:27.596 [2024-11-17 08:19:40.515244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:0aff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.515274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.596 [2024-11-17 08:19:40.515409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0aff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.515426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.596 
[2024-11-17 08:19:40.515543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.515559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.596 #33 NEW cov: 12450 ft: 15785 corp: 30/1010b lim: 45 exec/s: 33 rss: 73Mb L: 28/45 MS: 1 CrossOver- 00:07:27.596 [2024-11-17 08:19:40.565777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.565806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.596 [2024-11-17 08:19:40.565934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.565951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.596 [2024-11-17 08:19:40.566069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.566085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.596 [2024-11-17 08:19:40.566207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00110000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.596 [2024-11-17 08:19:40.566224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.596 #34 NEW cov: 12450 ft: 15818 corp: 31/1051b lim: 45 exec/s: 17 rss: 73Mb L: 41/45 MS: 1 CrossOver- 00:07:27.596 #34 DONE cov: 12450 ft: 15818 corp: 31/1051b lim: 45 exec/s: 17 rss: 73Mb 00:07:27.596 ###### Recommended dictionary. ###### 00:07:27.596 "vR}\244\205\226\212\000" # Uses: 2 00:07:27.596 ###### End of recommended dictionary. 
###### 00:07:27.596 Done 34 runs in 2 second(s) 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:27.596 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:27.855 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:27.855 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:27.855 08:19:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:27.855 [2024-11-17 08:19:40.763001] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:27.855 [2024-11-17 08:19:40.763076] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992134 ] 00:07:27.855 [2024-11-17 08:19:40.941598] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.855 [2024-11-17 08:19:40.963655] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.114 [2024-11-17 08:19:41.016301] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:28.114 [2024-11-17 08:19:41.032577] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:28.114 INFO: Running with entropic power schedule (0xFF, 100). 
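Fuzzer instance 5 above drove fuzz_admin_create_io_submission_queue_command, so nearly every *NOTICE* line in that run is a Create I/O Submission Queue (opcode 01h) admin command whose CDW10/CDW11 dwords the fuzzer mutates. Per the NVMe base specification, CDW10 packs the queue size (bits 31:16, 0-based) and the queue identifier (bits 15:0), while CDW11 packs the bound completion queue id (bits 31:16), the queue priority (bits 2:1) and the physically-contiguous flag (bit 0). The tiny helper below is purely a reading aid for the hex values in the log, not part of the harness:

  # Decode CDW10/CDW11 of a CREATE IO SQ (opcode 01h) print such as
  # "cdw10:ffff0a0a cdw11:ffff0007" from run 5 above.
  decode_create_io_sq() {
      local cdw10=$((16#$1)) cdw11=$((16#$2))
      printf 'QID=%u QSIZE=%u (0-based) CQID=%u QPRIO=%u PC=%u\n' \
          "$((cdw10 & 0xffff))" "$(((cdw10 >> 16) & 0xffff))" \
          "$(((cdw11 >> 16) & 0xffff))" "$(((cdw11 >> 1) & 0x3))" "$((cdw11 & 0x1))"
  }
  decode_create_io_sq ffff0a0a ffff0007
  # -> QID=2570 QSIZE=65535 (0-based) CQID=65535 QPRIO=3 PC=1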
00:07:28.114 INFO: Seed: 2256590976 00:07:28.114 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:28.114 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:28.114 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:28.114 INFO: A corpus is not provided, starting from an empty corpus 00:07:28.114 #2 INITED exec/s: 0 rss: 66Mb 00:07:28.114 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:28.114 This may also happen if the target rejected all inputs we tried so far 00:07:28.114 [2024-11-17 08:19:41.077375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.114 [2024-11-17 08:19:41.077414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.375 NEW_FUNC[1/713]: 0x464188 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:28.375 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:28.375 #3 NEW cov: 12140 ft: 12126 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CopyPart- 00:07:28.375 [2024-11-17 08:19:41.460037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000230a cdw11:00000000 00:07:28.375 [2024-11-17 08:19:41.460087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.375 #4 NEW cov: 12253 ft: 12942 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ChangeByte- 00:07:28.718 [2024-11-17 08:19:41.530141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000232f cdw11:00000000 00:07:28.718 [2024-11-17 08:19:41.530172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.718 #5 NEW cov: 12259 ft: 13196 corp: 4/7b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ChangeByte- 00:07:28.718 [2024-11-17 08:19:41.600408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:28.718 [2024-11-17 08:19:41.600437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.718 #6 NEW cov: 12344 ft: 13416 corp: 5/9b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:28.718 [2024-11-17 08:19:41.650791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000290a cdw11:00000000 00:07:28.718 [2024-11-17 08:19:41.650819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.718 #11 NEW cov: 12344 ft: 13538 corp: 6/12b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 5 ChangeByte-ShuffleBytes-CopyPart-ShuffleBytes-CrossOver- 00:07:28.718 [2024-11-17 08:19:41.700943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:28.718 [2024-11-17 08:19:41.700972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
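Instance 6 targets fuzz_admin_delete_io_completion_queue_command, so the mutated surface is smaller still: for Delete I/O Completion Queue (opcode 04h) the NVMe base specification defines only CDW10, whose low 16 bits carry the identifier of the completion queue to delete, the remaining bits being reserved. Reading one of the prints above the same way (again just a local convenience, not harness code):

  # DELETE IO CQ (opcode 04h): CDW10 bits 15:0 = QID of the CQ to delete.
  cdw10=0x00000a0a              # taken from a "cdw10:00000a0a" entry in this run
  printf 'QID=0x%04x (%u)\n' "$((cdw10 & 0xffff))" "$((cdw10 & 0xffff))"
  # -> QID=0x0a0a (2570)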
00:07:28.718 #13 NEW cov: 12344 ft: 13578 corp: 7/14b lim: 10 exec/s: 0 rss: 72Mb L: 2/3 MS: 2 EraseBytes-InsertByte- 00:07:28.718 [2024-11-17 08:19:41.771198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001c0a cdw11:00000000 00:07:28.718 [2024-11-17 08:19:41.771227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.718 #14 NEW cov: 12344 ft: 13640 corp: 8/17b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 InsertByte- 00:07:28.718 [2024-11-17 08:19:41.821506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003f8a cdw11:00000000 00:07:28.718 [2024-11-17 08:19:41.821534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.000 #17 NEW cov: 12344 ft: 13661 corp: 9/19b lim: 10 exec/s: 0 rss: 72Mb L: 2/3 MS: 3 ChangeBit-ShuffleBytes-InsertByte- 00:07:29.000 [2024-11-17 08:19:41.871499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae6 cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.871527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.000 #18 NEW cov: 12344 ft: 13711 corp: 10/21b lim: 10 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 ChangeByte- 00:07:29.000 [2024-11-17 08:19:41.922268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001c1c cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.922301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.000 [2024-11-17 08:19:41.922426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.922443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.000 [2024-11-17 08:19:41.922565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.922582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.000 #19 NEW cov: 12344 ft: 14001 corp: 11/27b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 CopyPart- 00:07:29.000 [2024-11-17 08:19:41.992496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.992524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.000 [2024-11-17 08:19:41.992651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff1c cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.992668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.000 [2024-11-17 08:19:41.992793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:29.000 [2024-11-17 08:19:41.992810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:07:29.000 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:29.000 #20 NEW cov: 12361 ft: 14126 corp: 12/33b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:29.000 [2024-11-17 08:19:42.042649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001c1c cdw11:00000000 00:07:29.000 [2024-11-17 08:19:42.042678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.000 [2024-11-17 08:19:42.042822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:29.000 [2024-11-17 08:19:42.042840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.000 [2024-11-17 08:19:42.042968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008a23 cdw11:00000000 00:07:29.000 [2024-11-17 08:19:42.042984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.000 #21 NEW cov: 12361 ft: 14156 corp: 13/39b lim: 10 exec/s: 21 rss: 72Mb L: 6/6 MS: 1 ChangeBit- 00:07:29.000 [2024-11-17 08:19:42.112396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000230a cdw11:00000000 00:07:29.000 [2024-11-17 08:19:42.112422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.259 #22 NEW cov: 12361 ft: 14169 corp: 14/41b lim: 10 exec/s: 22 rss: 73Mb L: 2/6 MS: 1 CrossOver- 00:07:29.259 [2024-11-17 08:19:42.182705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002902 cdw11:00000000 00:07:29.259 [2024-11-17 08:19:42.182733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.259 #23 NEW cov: 12361 ft: 14215 corp: 15/44b lim: 10 exec/s: 23 rss: 73Mb L: 3/6 MS: 1 ChangeBit- 00:07:29.259 [2024-11-17 08:19:42.252903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003fff cdw11:00000000 00:07:29.259 [2024-11-17 08:19:42.252931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.259 #24 NEW cov: 12361 ft: 14245 corp: 16/47b lim: 10 exec/s: 24 rss: 73Mb L: 3/6 MS: 1 CrossOver- 00:07:29.259 [2024-11-17 08:19:42.323097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002828 cdw11:00000000 00:07:29.259 [2024-11-17 08:19:42.323125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.259 #25 NEW cov: 12361 ft: 14269 corp: 17/50b lim: 10 exec/s: 25 rss: 73Mb L: 3/6 MS: 1 CopyPart- 00:07:29.259 [2024-11-17 08:19:42.373379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a23 cdw11:00000000 00:07:29.259 [2024-11-17 08:19:42.373408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.259 #26 NEW cov: 12361 ft: 14278 corp: 18/52b lim: 10 exec/s: 26 rss: 73Mb L: 2/6 
MS: 1 CopyPart- 00:07:29.517 [2024-11-17 08:19:42.423443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:07:29.517 [2024-11-17 08:19:42.423470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.517 #27 NEW cov: 12361 ft: 14301 corp: 19/54b lim: 10 exec/s: 27 rss: 73Mb L: 2/6 MS: 1 ShuffleBytes- 00:07:29.517 [2024-11-17 08:19:42.473598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:29.517 [2024-11-17 08:19:42.473624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.517 #30 NEW cov: 12361 ft: 14318 corp: 20/56b lim: 10 exec/s: 30 rss: 73Mb L: 2/6 MS: 3 EraseBytes-ChangeBit-CopyPart- 00:07:29.517 [2024-11-17 08:19:42.523790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009bff cdw11:00000000 00:07:29.517 [2024-11-17 08:19:42.523818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.517 #31 NEW cov: 12361 ft: 14319 corp: 21/59b lim: 10 exec/s: 31 rss: 73Mb L: 3/6 MS: 1 ChangeByte- 00:07:29.517 [2024-11-17 08:19:42.594306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009bff cdw11:00000000 00:07:29.517 [2024-11-17 08:19:42.594334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.517 [2024-11-17 08:19:42.594478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003aff cdw11:00000000 00:07:29.517 [2024-11-17 08:19:42.594497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.517 #32 NEW cov: 12361 ft: 14513 corp: 22/63b lim: 10 exec/s: 32 rss: 73Mb L: 4/6 MS: 1 InsertByte- 00:07:29.776 [2024-11-17 08:19:42.664315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009bff cdw11:00000000 00:07:29.776 [2024-11-17 08:19:42.664345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.776 #33 NEW cov: 12361 ft: 14545 corp: 23/66b lim: 10 exec/s: 33 rss: 73Mb L: 3/6 MS: 1 CopyPart- 00:07:29.776 [2024-11-17 08:19:42.714539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009bff cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.714566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.777 #34 NEW cov: 12361 ft: 14553 corp: 24/69b lim: 10 exec/s: 34 rss: 73Mb L: 3/6 MS: 1 CopyPart- 00:07:29.777 [2024-11-17 08:19:42.785497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002901 cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.785525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.777 [2024-11-17 08:19:42.785659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.777 [2024-11-17 
08:19:42.785677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.777 [2024-11-17 08:19:42.785812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.785829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.777 [2024-11-17 08:19:42.785956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.785976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.777 #36 NEW cov: 12361 ft: 14780 corp: 25/78b lim: 10 exec/s: 36 rss: 73Mb L: 9/9 MS: 2 CrossOver-CMP- DE: "\001\000\000\000\000\000\000\020"- 00:07:29.777 [2024-11-17 08:19:42.835076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000282f cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.835104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.777 #37 NEW cov: 12361 ft: 14865 corp: 26/81b lim: 10 exec/s: 37 rss: 73Mb L: 3/9 MS: 1 InsertByte- 00:07:29.777 [2024-11-17 08:19:42.905588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001e00 cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.905616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.777 [2024-11-17 08:19:42.905752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.777 [2024-11-17 08:19:42.905770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.036 #39 NEW cov: 12361 ft: 14880 corp: 27/86b lim: 10 exec/s: 39 rss: 73Mb L: 5/9 MS: 2 EraseBytes-CMP- DE: "\036\000\000\000"- 00:07:30.036 [2024-11-17 08:19:42.955925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007eff cdw11:00000000 00:07:30.036 [2024-11-17 08:19:42.955956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.036 [2024-11-17 08:19:42.956100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003aff cdw11:00000000 00:07:30.036 [2024-11-17 08:19:42.956119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.036 #40 NEW cov: 12368 ft: 14892 corp: 28/90b lim: 10 exec/s: 40 rss: 73Mb L: 4/9 MS: 1 ChangeByte- 00:07:30.036 [2024-11-17 08:19:43.025975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009bff cdw11:00000000 00:07:30.036 [2024-11-17 08:19:43.026004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.036 #41 NEW cov: 12368 ft: 14901 corp: 29/92b lim: 10 exec/s: 41 rss: 73Mb L: 2/9 MS: 1 EraseBytes- 00:07:30.036 [2024-11-17 08:19:43.076437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 
nsid:0 cdw10:00000b0b cdw11:00000000 00:07:30.036 [2024-11-17 08:19:43.076464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.036 [2024-11-17 08:19:43.076611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:30.036 [2024-11-17 08:19:43.076628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.036 #42 NEW cov: 12368 ft: 14910 corp: 30/96b lim: 10 exec/s: 21 rss: 73Mb L: 4/9 MS: 1 CopyPart- 00:07:30.036 #42 DONE cov: 12368 ft: 14910 corp: 30/96b lim: 10 exec/s: 21 rss: 73Mb 00:07:30.036 ###### Recommended dictionary. ###### 00:07:30.036 "\001\000\000\000\000\000\000\020" # Uses: 0 00:07:30.036 "\036\000\000\000" # Uses: 0 00:07:30.036 ###### End of recommended dictionary. ###### 00:07:30.036 Done 42 runs in 2 second(s) 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:30.296 08:19:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:30.296 [2024-11-17 
08:19:43.270507] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:30.296 [2024-11-17 08:19:43.270577] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992665 ] 00:07:30.555 [2024-11-17 08:19:43.447389] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.555 [2024-11-17 08:19:43.469044] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.555 [2024-11-17 08:19:43.521231] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.555 [2024-11-17 08:19:43.537517] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:30.555 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.555 INFO: Seed: 466625888 00:07:30.555 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:30.555 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:30.555 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:30.555 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.555 #2 INITED exec/s: 0 rss: 65Mb 00:07:30.555 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:30.555 This may also happen if the target rejected all inputs we tried so far 00:07:30.555 [2024-11-17 08:19:43.592927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.555 [2024-11-17 08:19:43.592958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.814 NEW_FUNC[1/713]: 0x464b88 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:30.814 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:30.814 #3 NEW cov: 12140 ft: 12138 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CrossOver- 00:07:30.814 [2024-11-17 08:19:43.903664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.814 [2024-11-17 08:19:43.903708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.814 #4 NEW cov: 12253 ft: 12719 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CopyPart- 00:07:31.073 [2024-11-17 08:19:43.963751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.073 [2024-11-17 08:19:43.963778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.073 #5 NEW cov: 12259 ft: 13007 corp: 4/8b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 CopyPart- 00:07:31.073 [2024-11-17 08:19:44.003850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3d cdw11:00000000 00:07:31.073 [2024-11-17 08:19:44.003876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:31.073 #8 NEW cov: 12344 ft: 13238 corp: 5/10b lim: 10 exec/s: 0 rss: 72Mb L: 2/3 MS: 3 ShuffleBytes-CrossOver-InsertByte- 00:07:31.073 [2024-11-17 08:19:44.044084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.073 [2024-11-17 08:19:44.044111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.073 [2024-11-17 08:19:44.044159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.073 [2024-11-17 08:19:44.044173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.073 #10 NEW cov: 12344 ft: 13482 corp: 6/14b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 2 CrossOver-CrossOver- 00:07:31.073 [2024-11-17 08:19:44.084068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.073 [2024-11-17 08:19:44.084094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.073 #11 NEW cov: 12344 ft: 13553 corp: 7/16b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 EraseBytes- 00:07:31.073 [2024-11-17 08:19:44.144261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c83d cdw11:00000000 00:07:31.073 [2024-11-17 08:19:44.144288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.073 #13 NEW cov: 12344 ft: 13684 corp: 8/18b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 2 EraseBytes-InsertByte- 00:07:31.073 [2024-11-17 08:19:44.204403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3f cdw11:00000000 00:07:31.073 [2024-11-17 08:19:44.204429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.332 #14 NEW cov: 12344 ft: 13750 corp: 9/20b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 InsertByte- 00:07:31.332 [2024-11-17 08:19:44.244523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003d0a cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.244548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.332 #15 NEW cov: 12344 ft: 13816 corp: 10/23b lim: 10 exec/s: 0 rss: 72Mb L: 3/4 MS: 1 CrossOver- 00:07:31.332 [2024-11-17 08:19:44.305142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.305168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.332 [2024-11-17 08:19:44.305214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.305227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.332 [2024-11-17 08:19:44.305274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:31.332 [2024-11-17 
08:19:44.305287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.332 [2024-11-17 08:19:44.305334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.305346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.332 [2024-11-17 08:19:44.305393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.305405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:31.332 #16 NEW cov: 12344 ft: 14160 corp: 11/33b lim: 10 exec/s: 0 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:31.332 [2024-11-17 08:19:44.364891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af1 cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.364918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.332 #17 NEW cov: 12344 ft: 14194 corp: 12/36b lim: 10 exec/s: 0 rss: 73Mb L: 3/10 MS: 1 ChangeBinInt- 00:07:31.332 [2024-11-17 08:19:44.404993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f10a cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.405018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.332 #18 NEW cov: 12344 ft: 14231 corp: 13/38b lim: 10 exec/s: 0 rss: 73Mb L: 2/10 MS: 1 CrossOver- 00:07:31.332 [2024-11-17 08:19:44.445309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.445334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.332 [2024-11-17 08:19:44.445381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.445394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.332 [2024-11-17 08:19:44.445442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:31.332 [2024-11-17 08:19:44.445455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.591 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:31.591 #20 NEW cov: 12367 ft: 14409 corp: 14/45b lim: 10 exec/s: 0 rss: 73Mb L: 7/10 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:31.591 [2024-11-17 08:19:44.505508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.505533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.591 [2024-11-17 08:19:44.505582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c2c6 
cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.505598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.591 [2024-11-17 08:19:44.505642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.505655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.591 #21 NEW cov: 12367 ft: 14430 corp: 15/52b lim: 10 exec/s: 0 rss: 73Mb L: 7/10 MS: 1 ChangeBit- 00:07:31.591 [2024-11-17 08:19:44.565426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a0a cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.565452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.591 #22 NEW cov: 12367 ft: 14481 corp: 16/54b lim: 10 exec/s: 22 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:07:31.591 [2024-11-17 08:19:44.625612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000023d cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.625638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.591 #23 NEW cov: 12367 ft: 14502 corp: 17/56b lim: 10 exec/s: 23 rss: 73Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:31.591 [2024-11-17 08:19:44.665699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.665725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.591 #25 NEW cov: 12367 ft: 14520 corp: 18/58b lim: 10 exec/s: 25 rss: 73Mb L: 2/10 MS: 2 EraseBytes-CopyPart- 00:07:31.591 [2024-11-17 08:19:44.726015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a0a cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.726041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.591 [2024-11-17 08:19:44.726093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001a0a cdw11:00000000 00:07:31.591 [2024-11-17 08:19:44.726106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.851 #26 NEW cov: 12367 ft: 14543 corp: 19/62b lim: 10 exec/s: 26 rss: 73Mb L: 4/10 MS: 1 CopyPart- 00:07:31.851 [2024-11-17 08:19:44.786272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cec6 cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.786297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.851 [2024-11-17 08:19:44.786347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c2c6 cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.786360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.851 [2024-11-17 08:19:44.786407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c6c6 
cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.786420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.851 #27 NEW cov: 12367 ft: 14558 corp: 20/69b lim: 10 exec/s: 27 rss: 73Mb L: 7/10 MS: 1 ChangeBit- 00:07:31.851 [2024-11-17 08:19:44.846283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.846308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.851 [2024-11-17 08:19:44.846355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.846368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.851 #28 NEW cov: 12367 ft: 14600 corp: 21/73b lim: 10 exec/s: 28 rss: 73Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:31.851 [2024-11-17 08:19:44.886289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003dc8 cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.886314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.851 #30 NEW cov: 12367 ft: 14632 corp: 22/76b lim: 10 exec/s: 30 rss: 73Mb L: 3/10 MS: 2 EraseBytes-CrossOver- 00:07:31.851 [2024-11-17 08:19:44.926426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000028f1 cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.926450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.851 #31 NEW cov: 12367 ft: 14656 corp: 23/79b lim: 10 exec/s: 31 rss: 73Mb L: 3/10 MS: 1 ChangeByte- 00:07:31.851 [2024-11-17 08:19:44.986599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:31.851 [2024-11-17 08:19:44.986624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.110 #32 NEW cov: 12367 ft: 14741 corp: 24/81b lim: 10 exec/s: 32 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:07:32.110 [2024-11-17 08:19:45.026829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b5b5 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.026854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 08:19:45.026902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b50a cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.026915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.110 #33 NEW cov: 12367 ft: 14753 corp: 25/85b lim: 10 exec/s: 33 rss: 73Mb L: 4/10 MS: 1 InsertRepeatedBytes- 00:07:32.110 [2024-11-17 08:19:45.066912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.066937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 
08:19:45.066986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c63d cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.066999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.110 #34 NEW cov: 12367 ft: 14765 corp: 26/89b lim: 10 exec/s: 34 rss: 73Mb L: 4/10 MS: 1 EraseBytes- 00:07:32.110 [2024-11-17 08:19:45.106922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.106948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.110 #35 NEW cov: 12367 ft: 14778 corp: 27/91b lim: 10 exec/s: 35 rss: 73Mb L: 2/10 MS: 1 EraseBytes- 00:07:32.110 [2024-11-17 08:19:45.147114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000002a7 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.147139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 08:19:45.147188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a7a7 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.147201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.110 #36 NEW cov: 12367 ft: 14835 corp: 28/96b lim: 10 exec/s: 36 rss: 74Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:32.110 [2024-11-17 08:19:45.207577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.207606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 08:19:45.207657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.207669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 08:19:45.207723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.207737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 08:19:45.207785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.207798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.110 [2024-11-17 08:19:45.207847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:32.110 [2024-11-17 08:19:45.207860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.369 #37 NEW cov: 12367 ft: 14847 corp: 29/106b lim: 10 exec/s: 37 rss: 74Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:32.369 [2024-11-17 08:19:45.267780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.369 [2024-11-17 08:19:45.267805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.369 [2024-11-17 08:19:45.267855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.369 [2024-11-17 08:19:45.267868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.267914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.267927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.267976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000aa1 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.267989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.268036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.268050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.370 #38 NEW cov: 12367 ft: 14857 corp: 30/116b lim: 10 exec/s: 38 rss: 74Mb L: 10/10 MS: 1 CopyPart- 00:07:32.370 [2024-11-17 08:19:45.307430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003d0a cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.307455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.370 #39 NEW cov: 12367 ft: 14868 corp: 31/119b lim: 10 exec/s: 39 rss: 74Mb L: 3/10 MS: 1 ChangeByte- 00:07:32.370 [2024-11-17 08:19:45.348017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.348043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.348092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.348105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.348156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a1a0 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.348170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.348218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.348231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.348277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:8 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.348290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.370 #40 NEW cov: 12367 ft: 14922 corp: 32/129b lim: 10 exec/s: 40 rss: 74Mb L: 10/10 MS: 1 ChangeBit- 00:07:32.370 [2024-11-17 08:19:45.388172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.388198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.388247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.388260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.388309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.388322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.388370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.388383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.370 [2024-11-17 08:19:45.388431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.388445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.370 #41 NEW cov: 12367 ft: 14953 corp: 33/139b lim: 10 exec/s: 41 rss: 74Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:32.370 [2024-11-17 08:19:45.447870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:07:32.370 [2024-11-17 08:19:45.447896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.370 #42 NEW cov: 12367 ft: 14962 corp: 34/142b lim: 10 exec/s: 42 rss: 74Mb L: 3/10 MS: 1 CopyPart- 00:07:32.629 [2024-11-17 08:19:45.508276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.508301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.629 [2024-11-17 08:19:45.508352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c63d cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.508366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.629 [2024-11-17 08:19:45.508414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000023d cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.508428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:32.629 #43 NEW cov: 12367 ft: 14973 corp: 35/148b lim: 10 exec/s: 43 rss: 74Mb L: 6/10 MS: 1 CrossOver- 00:07:32.629 [2024-11-17 08:19:45.568516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c6c6 cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.568541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.629 [2024-11-17 08:19:45.568591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f5f5 cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.568604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.629 [2024-11-17 08:19:45.568653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f5f5 cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.568666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.629 [2024-11-17 08:19:45.568719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f5c6 cdw11:00000000 00:07:32.629 [2024-11-17 08:19:45.568733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.629 #44 NEW cov: 12367 ft: 15004 corp: 36/157b lim: 10 exec/s: 22 rss: 74Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:32.629 #44 DONE cov: 12367 ft: 15004 corp: 36/157b lim: 10 exec/s: 22 rss: 74Mb 00:07:32.629 Done 44 runs in 2 second(s) 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:32.629 08:19:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:07:32.629 [2024-11-17 08:19:45.746936] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:32.629 [2024-11-17 08:19:45.747006] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid992959 ] 00:07:32.887 [2024-11-17 08:19:45.931759] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.887 [2024-11-17 08:19:45.953567] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.887 [2024-11-17 08:19:46.006043] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.887 [2024-11-17 08:19:46.022383] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:33.146 INFO: Running with entropic power schedule (0xFF, 100). 00:07:33.146 INFO: Seed: 2951622574 00:07:33.146 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:33.146 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:33.146 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:33.146 INFO: A corpus is not provided, starting from an empty corpus 00:07:33.146 [2024-11-17 08:19:46.088509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.146 [2024-11-17 08:19:46.088546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.146 #2 INITED cov: 12168 ft: 12155 corp: 1/1b exec/s: 0 rss: 70Mb 00:07:33.146 [2024-11-17 08:19:46.138614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.146 [2024-11-17 08:19:46.138643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.146 #3 NEW cov: 12281 ft: 12637 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:33.146 [2024-11-17 08:19:46.209121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.146 [2024-11-17 08:19:46.209151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.146 [2024-11-17 08:19:46.209263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.146 [2024-11-17 08:19:46.209280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.146 #4 NEW cov: 12287 ft: 13522 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:07:33.146 [2024-11-17 08:19:46.248928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.146 [2024-11-17 08:19:46.248956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.146 #5 NEW cov: 12372 ft: 13930 corp: 4/5b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeBit- 00:07:33.404 [2024-11-17 08:19:46.299067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.404 [2024-11-17 08:19:46.299097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.404 #6 NEW cov: 12372 ft: 14043 corp: 5/6b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeByte- 00:07:33.404 [2024-11-17 08:19:46.349208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.404 [2024-11-17 08:19:46.349235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.404 #7 NEW cov: 12372 ft: 14191 corp: 6/7b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 CopyPart- 00:07:33.404 [2024-11-17 08:19:46.419424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.404 [2024-11-17 08:19:46.419453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.404 #8 NEW cov: 12372 ft: 14236 corp: 7/8b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:33.404 [2024-11-17 08:19:46.479541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.404 [2024-11-17 08:19:46.479568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.404 #9 NEW cov: 12372 ft: 14409 corp: 8/9b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeByte- 00:07:33.404 [2024-11-17 08:19:46.520008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.404 [2024-11-17 08:19:46.520035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.404 [2024-11-17 08:19:46.520156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.404 [2024-11-17 08:19:46.520173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.663 #10 NEW cov: 12372 ft: 14444 corp: 
9/11b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:07:33.663 [2024-11-17 08:19:46.570369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.570395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.570512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.570544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.570654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.570670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.663 #11 NEW cov: 12372 ft: 14652 corp: 10/14b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 CrossOver- 00:07:33.663 [2024-11-17 08:19:46.641174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.641198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.641313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.641328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.641450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.641466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.641585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.641601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.641728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.641745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.663 #12 NEW cov: 12372 ft: 15053 corp: 11/19b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:33.663 [2024-11-17 08:19:46.710420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 
08:19:46.710447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.710568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.710584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.663 #13 NEW cov: 12372 ft: 15088 corp: 12/21b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 ChangeBit- 00:07:33.663 [2024-11-17 08:19:46.780738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.780766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.663 [2024-11-17 08:19:46.780887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.663 [2024-11-17 08:19:46.780906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.923 #14 NEW cov: 12372 ft: 15139 corp: 13/23b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 ChangeBit- 00:07:33.923 [2024-11-17 08:19:46.830837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.830865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.831002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.831019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.923 #15 NEW cov: 12372 ft: 15165 corp: 14/25b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 CopyPart- 00:07:33.923 [2024-11-17 08:19:46.901857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.901884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.902017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.902034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.902153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.902170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.902291] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.902308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.902431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.902446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.923 #16 NEW cov: 12372 ft: 15179 corp: 15/30b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:33.923 [2024-11-17 08:19:46.962092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.962120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.962253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.962269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.962397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.962414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.962532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.962548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.923 [2024-11-17 08:19:46.962664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.923 [2024-11-17 08:19:46.962680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.182 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:34.182 #17 NEW cov: 12395 ft: 15216 corp: 16/35b lim: 5 exec/s: 17 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:34.182 [2024-11-17 08:19:47.292486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.182 [2024-11-17 08:19:47.292521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.182 [2024-11-17 08:19:47.292654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:34.182 [2024-11-17 08:19:47.292673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.441 #18 NEW cov: 12395 ft: 15275 corp: 17/37b lim: 5 exec/s: 18 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:07:34.441 [2024-11-17 08:19:47.363570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.363600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.441 [2024-11-17 08:19:47.363751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.363768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.441 [2024-11-17 08:19:47.363915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.363935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.441 [2024-11-17 08:19:47.364067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.364085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.441 [2024-11-17 08:19:47.364225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.364242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.441 #19 NEW cov: 12395 ft: 15314 corp: 18/42b lim: 5 exec/s: 19 rss: 73Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:34.441 [2024-11-17 08:19:47.432879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.432908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.441 [2024-11-17 08:19:47.433051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.433069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.441 #20 NEW cov: 12395 ft: 15352 corp: 19/44b lim: 5 exec/s: 20 rss: 73Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:34.441 [2024-11-17 08:19:47.483043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.483072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.441 [2024-11-17 
08:19:47.483227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.441 [2024-11-17 08:19:47.483245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.441 #21 NEW cov: 12395 ft: 15430 corp: 20/46b lim: 5 exec/s: 21 rss: 73Mb L: 2/5 MS: 1 CrossOver- 00:07:34.442 [2024-11-17 08:19:47.533275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.442 [2024-11-17 08:19:47.533305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.442 [2024-11-17 08:19:47.533455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.442 [2024-11-17 08:19:47.533472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.442 #22 NEW cov: 12395 ft: 15441 corp: 21/48b lim: 5 exec/s: 22 rss: 73Mb L: 2/5 MS: 1 CrossOver- 00:07:34.700 [2024-11-17 08:19:47.583701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.583731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.700 [2024-11-17 08:19:47.583879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.583900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.700 [2024-11-17 08:19:47.584031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.584048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.700 #23 NEW cov: 12395 ft: 15465 corp: 22/51b lim: 5 exec/s: 23 rss: 73Mb L: 3/5 MS: 1 CrossOver- 00:07:34.700 [2024-11-17 08:19:47.654610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.654637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.700 [2024-11-17 08:19:47.654773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.654790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.700 [2024-11-17 08:19:47.654923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 
[2024-11-17 08:19:47.654940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.700 [2024-11-17 08:19:47.655067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.655084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.700 [2024-11-17 08:19:47.655210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.700 [2024-11-17 08:19:47.655227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.700 #24 NEW cov: 12395 ft: 15542 corp: 23/56b lim: 5 exec/s: 24 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:07:34.701 [2024-11-17 08:19:47.723943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.701 [2024-11-17 08:19:47.723973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.701 [2024-11-17 08:19:47.724100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.701 [2024-11-17 08:19:47.724115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.701 #25 NEW cov: 12395 ft: 15556 corp: 24/58b lim: 5 exec/s: 25 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:07:34.701 [2024-11-17 08:19:47.794194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.701 [2024-11-17 08:19:47.794224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.701 [2024-11-17 08:19:47.794375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.701 [2024-11-17 08:19:47.794392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.701 #26 NEW cov: 12395 ft: 15560 corp: 25/60b lim: 5 exec/s: 26 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:07:34.960 [2024-11-17 08:19:47.844616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:47.844644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-11-17 08:19:47.844774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:47.844792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 [2024-11-17 08:19:47.844948] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:47.844966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.960 #27 NEW cov: 12395 ft: 15584 corp: 26/63b lim: 5 exec/s: 27 rss: 74Mb L: 3/5 MS: 1 ChangeByte- 00:07:34.960 [2024-11-17 08:19:47.914183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:47.914212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 #28 NEW cov: 12395 ft: 15592 corp: 27/64b lim: 5 exec/s: 28 rss: 74Mb L: 1/5 MS: 1 EraseBytes- 00:07:34.960 [2024-11-17 08:19:47.964681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:47.964715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-11-17 08:19:47.964859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:47.964877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 #29 NEW cov: 12395 ft: 15662 corp: 28/66b lim: 5 exec/s: 29 rss: 74Mb L: 2/5 MS: 1 ChangeByte- 00:07:34.960 [2024-11-17 08:19:48.014533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:48.014560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 #30 NEW cov: 12395 ft: 15690 corp: 29/67b lim: 5 exec/s: 30 rss: 74Mb L: 1/5 MS: 1 ChangeBit- 00:07:34.960 [2024-11-17 08:19:48.065100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:48.065127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.960 [2024-11-17 08:19:48.065282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.960 [2024-11-17 08:19:48.065300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.960 #31 NEW cov: 12395 ft: 15696 corp: 30/69b lim: 5 exec/s: 15 rss: 74Mb L: 2/5 MS: 1 ChangeBit- 00:07:34.960 #31 DONE cov: 12395 ft: 15696 corp: 30/69b lim: 5 exec/s: 15 rss: 74Mb 00:07:34.960 Done 31 runs in 2 second(s) 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 
00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:35.220 08:19:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:07:35.220 [2024-11-17 08:19:48.237599] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:35.220 [2024-11-17 08:19:48.237665] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid993487 ] 00:07:35.480 [2024-11-17 08:19:48.411419] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.480 [2024-11-17 08:19:48.432758] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.480 [2024-11-17 08:19:48.485023] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.480 [2024-11-17 08:19:48.501320] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:35.480 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:35.480 INFO: Seed: 1135669962 00:07:35.480 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:35.480 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:35.480 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:35.480 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.480 [2024-11-17 08:19:48.546128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.480 [2024-11-17 08:19:48.546163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.480 #2 INITED cov: 12168 ft: 12167 corp: 1/1b exec/s: 0 rss: 70Mb 00:07:35.480 [2024-11-17 08:19:48.596179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.480 [2024-11-17 08:19:48.596210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.480 [2024-11-17 08:19:48.596242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.480 [2024-11-17 08:19:48.596262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.739 #3 NEW cov: 12281 ft: 13533 corp: 2/3b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:07:35.740 [2024-11-17 08:19:48.686421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.686451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.740 [2024-11-17 08:19:48.686483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.686498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.740 #4 NEW cov: 12287 ft: 13776 corp: 3/5b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeByte- 00:07:35.740 [2024-11-17 08:19:48.776562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.776592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.740 #5 NEW cov: 12372 ft: 14003 corp: 4/6b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:35.740 [2024-11-17 08:19:48.836942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.836973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.740 [2024-11-17 08:19:48.837007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.837023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.740 [2024-11-17 08:19:48.837052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.837068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.740 [2024-11-17 08:19:48.837096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.740 [2024-11-17 08:19:48.837111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.999 #6 NEW cov: 12372 ft: 14335 corp: 5/10b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CMP- DE: "\020\000"- 00:07:35.999 [2024-11-17 08:19:48.897073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:48.897105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.999 [2024-11-17 08:19:48.897139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:48.897155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.999 [2024-11-17 08:19:48.897184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:48.897201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.999 #7 NEW cov: 12372 ft: 14604 corp: 6/13b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 EraseBytes- 00:07:35.999 [2024-11-17 08:19:48.987315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:48.987346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.999 [2024-11-17 08:19:48.987378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:48.987393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.999 #8 NEW cov: 12372 ft: 14679 corp: 7/15b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 CrossOver- 00:07:35.999 [2024-11-17 08:19:49.047494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:49.047524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.999 [2024-11-17 08:19:49.047557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:49.047571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.999 [2024-11-17 08:19:49.047599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.999 [2024-11-17 08:19:49.047615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.999 #9 NEW cov: 12372 ft: 14707 corp: 8/18b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 InsertByte- 00:07:36.270 [2024-11-17 08:19:49.137828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.137860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.137894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.137911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.137941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.137957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.270 #10 NEW cov: 12372 ft: 14738 corp: 9/21b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 ChangeBit- 00:07:36.270 [2024-11-17 08:19:49.227976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.228007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.228039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.228054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.270 #11 NEW cov: 12372 ft: 14816 corp: 10/23b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:07:36.270 [2024-11-17 08:19:49.318329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.318361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.318395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.318411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.318441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.318457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.318486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.318502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.270 #12 NEW cov: 12372 ft: 14883 corp: 11/27b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ChangeByte- 00:07:36.270 [2024-11-17 08:19:49.378334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.378366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.270 [2024-11-17 08:19:49.378399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.270 [2024-11-17 08:19:49.378415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.788 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:36.788 #13 NEW cov: 12395 ft: 14920 corp: 12/29b lim: 5 exec/s: 13 rss: 73Mb L: 2/4 MS: 1 ChangeBit- 00:07:36.788 [2024-11-17 08:19:49.729358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.729395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.729429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.729445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.788 #14 NEW cov: 12395 ft: 14961 corp: 13/31b lim: 5 exec/s: 14 rss: 73Mb L: 2/4 MS: 1 ChangeBit- 00:07:36.788 [2024-11-17 08:19:49.789443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.789475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.789509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 
[2024-11-17 08:19:49.789525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.789555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.789575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.788 #15 NEW cov: 12395 ft: 14994 corp: 14/34b lim: 5 exec/s: 15 rss: 73Mb L: 3/4 MS: 1 CrossOver- 00:07:36.788 [2024-11-17 08:19:49.879817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.879849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.879882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.879898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.879928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.879944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.879973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.879988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.788 [2024-11-17 08:19:49.880016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.788 [2024-11-17 08:19:49.880031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.788 #16 NEW cov: 12395 ft: 15119 corp: 15/39b lim: 5 exec/s: 16 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:07:37.048 [2024-11-17 08:19:49.939713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.048 [2024-11-17 08:19:49.939744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.048 #17 NEW cov: 12395 ft: 15178 corp: 16/40b lim: 5 exec/s: 17 rss: 73Mb L: 1/5 MS: 1 CopyPart- 00:07:37.048 [2024-11-17 08:19:50.001094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.048 [2024-11-17 08:19:50.001121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.048 [2024-11-17 08:19:50.001176] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.048 [2024-11-17 08:19:50.001190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.048 [2024-11-17 08:19:50.001239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.048 [2024-11-17 08:19:50.001253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.048 [2024-11-17 08:19:50.001307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.001321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.001377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.001394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.049 #18 NEW cov: 12395 ft: 15245 corp: 17/45b lim: 5 exec/s: 18 rss: 73Mb L: 5/5 MS: 1 PersAutoDict- DE: "\020\000"- 00:07:37.049 [2024-11-17 08:19:50.061123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.061150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.061202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.061216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.061270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.061284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.061337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.061351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.049 #19 NEW cov: 12395 ft: 15284 corp: 18/49b lim: 5 exec/s: 19 rss: 73Mb L: 4/5 MS: 1 InsertByte- 00:07:37.049 [2024-11-17 08:19:50.121436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.121463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.121516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.121530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.121585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.121599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.121653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.121666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.121722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.121736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.049 #20 NEW cov: 12395 ft: 15293 corp: 19/54b lim: 5 exec/s: 20 rss: 73Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:37.049 [2024-11-17 08:19:50.181557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.181583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.181638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.181656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.181709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.181724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.181777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.181791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.049 [2024-11-17 08:19:50.181845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.049 [2024-11-17 08:19:50.181858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:07:37.308 #21 NEW cov: 12395 ft: 15307 corp: 20/59b lim: 5 exec/s: 21 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:07:37.308 [2024-11-17 08:19:50.241709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.241751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.241806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.241820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.241873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.241887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.241940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.241954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.242007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.242021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.308 #22 NEW cov: 12395 ft: 15326 corp: 21/64b lim: 5 exec/s: 22 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:07:37.308 [2024-11-17 08:19:50.301728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.301765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.301820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.301834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.301889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.301905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.301958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.301971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.308 #23 NEW cov: 12395 ft: 15354 corp: 22/68b lim: 5 exec/s: 23 rss: 73Mb L: 4/5 MS: 1 ChangeByte- 00:07:37.308 [2024-11-17 08:19:50.341644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.341671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.341727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.341758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.341815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.341829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.308 #24 NEW cov: 12395 ft: 15403 corp: 23/71b lim: 5 exec/s: 24 rss: 73Mb L: 3/5 MS: 1 ChangeBit- 00:07:37.308 [2024-11-17 08:19:50.381938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.381963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.382016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.308 [2024-11-17 08:19:50.382030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.308 [2024-11-17 08:19:50.382083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.309 [2024-11-17 08:19:50.382097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.309 [2024-11-17 08:19:50.382151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.309 [2024-11-17 08:19:50.382165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.309 #25 NEW cov: 12395 ft: 15420 corp: 24/75b lim: 5 exec/s: 25 rss: 73Mb L: 4/5 MS: 1 EraseBytes- 00:07:37.309 [2024-11-17 08:19:50.421893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.309 [2024-11-17 08:19:50.421919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.309 [2024-11-17 08:19:50.421972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.309 [2024-11-17 08:19:50.421986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.309 [2024-11-17 08:19:50.422058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.309 [2024-11-17 08:19:50.422073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.309 #26 NEW cov: 12396 ft: 15467 corp: 25/78b lim: 5 exec/s: 26 rss: 73Mb L: 3/5 MS: 1 CopyPart- 00:07:37.569 [2024-11-17 08:19:50.462196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.462223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.569 [2024-11-17 08:19:50.462276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.462290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.569 [2024-11-17 08:19:50.462343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.462357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.569 [2024-11-17 08:19:50.462408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.462422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.569 #27 NEW cov: 12396 ft: 15502 corp: 26/82b lim: 5 exec/s: 27 rss: 73Mb L: 4/5 MS: 1 PersAutoDict- DE: "\020\000"- 00:07:37.569 [2024-11-17 08:19:50.522178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.522205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.569 [2024-11-17 08:19:50.522259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.522273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.569 [2024-11-17 08:19:50.522328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.569 [2024-11-17 08:19:50.522341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.569 #28 NEW cov: 12396 ft: 15540 corp: 27/85b lim: 
5 exec/s: 14 rss: 73Mb L: 3/5 MS: 1 PersAutoDict- DE: "\020\000"- 00:07:37.569 #28 DONE cov: 12396 ft: 15540 corp: 27/85b lim: 5 exec/s: 14 rss: 73Mb 00:07:37.569 ###### Recommended dictionary. ###### 00:07:37.569 "\020\000" # Uses: 3 00:07:37.569 ###### End of recommended dictionary. ###### 00:07:37.569 Done 28 runs in 2 second(s) 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:37.569 08:19:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:07:37.569 [2024-11-17 08:19:50.694525] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:37.569 [2024-11-17 08:19:50.694594] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid993886 ] 00:07:37.828 [2024-11-17 08:19:50.876545] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.828 [2024-11-17 08:19:50.899597] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.828 [2024-11-17 08:19:50.952093] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.088 [2024-11-17 08:19:50.968436] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:38.088 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.088 INFO: Seed: 3602641852 00:07:38.088 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:38.088 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:38.088 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:38.088 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.088 #2 INITED exec/s: 0 rss: 65Mb 00:07:38.088 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:38.088 This may also happen if the target rejected all inputs we tried so far 00:07:38.088 [2024-11-17 08:19:51.023828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.088 [2024-11-17 08:19:51.023856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.346 NEW_FUNC[1/714]: 0x466508 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:38.346 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.346 #8 NEW cov: 12191 ft: 12190 corp: 2/9b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:38.346 [2024-11-17 08:19:51.354591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9ff8 cdw11:9f9f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.346 [2024-11-17 08:19:51.354620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.346 #9 NEW cov: 12304 ft: 12721 corp: 3/18b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 InsertByte- 00:07:38.346 [2024-11-17 08:19:51.414677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.346 [2024-11-17 08:19:51.414708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.346 #11 NEW cov: 12310 ft: 13026 corp: 4/28b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:38.346 [2024-11-17 08:19:51.454758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fcffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.346 [2024-11-17 08:19:51.454783] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.346 #15 NEW cov: 12395 ft: 13258 corp: 5/36b lim: 40 exec/s: 0 rss: 72Mb L: 8/10 MS: 4 CrossOver-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:38.605 [2024-11-17 08:19:51.494905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fcffffff cdw11:2dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.605 [2024-11-17 08:19:51.494931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.605 #16 NEW cov: 12395 ft: 13461 corp: 6/45b lim: 40 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 InsertByte- 00:07:38.605 [2024-11-17 08:19:51.555061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fcfffffc cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.605 [2024-11-17 08:19:51.555086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.605 #17 NEW cov: 12395 ft: 13528 corp: 7/54b lim: 40 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 CrossOver- 00:07:38.605 [2024-11-17 08:19:51.615223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9ff8 cdw11:9f9ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.605 [2024-11-17 08:19:51.615248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.605 #18 NEW cov: 12395 ft: 13556 corp: 8/67b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 CopyPart- 00:07:38.605 [2024-11-17 08:19:51.675384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9f9f cdw11:9f9f8d0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.605 [2024-11-17 08:19:51.675409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.605 #24 NEW cov: 12395 ft: 13604 corp: 9/75b lim: 40 exec/s: 0 rss: 72Mb L: 8/13 MS: 1 CrossOver- 00:07:38.605 [2024-11-17 08:19:51.715500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9d9f9f9f cdw11:9f9f9f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.605 [2024-11-17 08:19:51.715525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.605 #25 NEW cov: 12395 ft: 13728 corp: 10/83b lim: 40 exec/s: 0 rss: 72Mb L: 8/13 MS: 1 ChangeBit- 00:07:38.864 [2024-11-17 08:19:51.755596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9fbf9ff8 cdw11:9f9ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.864 [2024-11-17 08:19:51.755621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.864 #26 NEW cov: 12395 ft: 13782 corp: 11/96b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ChangeBit- 00:07:38.864 [2024-11-17 08:19:51.815803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fcfffffc cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.864 [2024-11-17 08:19:51.815829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.864 #27 
NEW cov: 12395 ft: 13846 corp: 12/106b lim: 40 exec/s: 0 rss: 73Mb L: 10/13 MS: 1 InsertByte- 00:07:38.864 [2024-11-17 08:19:51.875962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9d9f419f cdw11:9f9f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.864 [2024-11-17 08:19:51.875988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.864 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:38.864 #28 NEW cov: 12418 ft: 13917 corp: 13/115b lim: 40 exec/s: 0 rss: 73Mb L: 9/13 MS: 1 InsertByte- 00:07:38.864 [2024-11-17 08:19:51.936123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9ff89f cdw11:9f9f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.864 [2024-11-17 08:19:51.936148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.864 #29 NEW cov: 12418 ft: 13957 corp: 14/124b lim: 40 exec/s: 0 rss: 73Mb L: 9/13 MS: 1 ShuffleBytes- 00:07:38.864 [2024-11-17 08:19:51.976248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bf9f9f9f cdw11:f89ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.864 [2024-11-17 08:19:51.976276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.123 #30 NEW cov: 12418 ft: 14030 corp: 15/137b lim: 40 exec/s: 30 rss: 73Mb L: 13/13 MS: 1 ShuffleBytes- 00:07:39.123 [2024-11-17 08:19:52.036426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fc2dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.123 [2024-11-17 08:19:52.036453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.123 #31 NEW cov: 12418 ft: 14053 corp: 16/146b lim: 40 exec/s: 31 rss: 73Mb L: 9/13 MS: 1 ShuffleBytes- 00:07:39.123 [2024-11-17 08:19:52.076508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9ff8 cdw11:9f9ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.123 [2024-11-17 08:19:52.076534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.123 #32 NEW cov: 12418 ft: 14154 corp: 17/160b lim: 40 exec/s: 32 rss: 73Mb L: 14/14 MS: 1 InsertByte- 00:07:39.123 [2024-11-17 08:19:52.116592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fc2dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.123 [2024-11-17 08:19:52.116617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.123 #33 NEW cov: 12418 ft: 14173 corp: 18/169b lim: 40 exec/s: 33 rss: 73Mb L: 9/14 MS: 1 ShuffleBytes- 00:07:39.123 [2024-11-17 08:19:52.156752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d7a8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.123 [2024-11-17 08:19:52.156778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.123 #34 NEW cov: 
12418 ft: 14186 corp: 19/179b lim: 40 exec/s: 34 rss: 73Mb L: 10/14 MS: 1 ChangeByte- 00:07:39.123 [2024-11-17 08:19:52.216972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fc2dffff cdw11:2bffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.123 [2024-11-17 08:19:52.216998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.123 #35 NEW cov: 12418 ft: 14234 corp: 20/189b lim: 40 exec/s: 35 rss: 73Mb L: 10/14 MS: 1 InsertByte- 00:07:39.381 [2024-11-17 08:19:52.277086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bf9f9f9f cdw11:f89ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.277111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.381 #36 NEW cov: 12418 ft: 14254 corp: 21/202b lim: 40 exec/s: 36 rss: 73Mb L: 13/14 MS: 1 CopyPart- 00:07:39.381 [2024-11-17 08:19:52.337288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:409f9f9f cdw11:f89f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.337312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.381 #40 NEW cov: 12418 ft: 14260 corp: 22/214b lim: 40 exec/s: 40 rss: 73Mb L: 12/14 MS: 4 CopyPart-CrossOver-InsertByte-CrossOver- 00:07:39.381 [2024-11-17 08:19:52.377497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bf9f9f9f cdw11:82000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.377521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.381 [2024-11-17 08:19:52.377582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f89ff89f cdw11:9f9f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.377596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.381 #41 NEW cov: 12418 ft: 14561 corp: 23/231b lim: 40 exec/s: 41 rss: 73Mb L: 17/17 MS: 1 CMP- DE: "\202\000\000\000"- 00:07:39.381 [2024-11-17 08:19:52.417497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bc2dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.417522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.381 #42 NEW cov: 12418 ft: 14571 corp: 24/240b lim: 40 exec/s: 42 rss: 73Mb L: 9/17 MS: 1 ChangeBit- 00:07:39.381 [2024-11-17 08:19:52.457973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:08a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.457998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.381 [2024-11-17 08:19:52.458054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 
[2024-11-17 08:19:52.458068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.381 [2024-11-17 08:19:52.458122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.458135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.381 [2024-11-17 08:19:52.458191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.458205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.381 #45 NEW cov: 12418 ft: 15107 corp: 25/279b lim: 40 exec/s: 45 rss: 73Mb L: 39/39 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:07:39.381 [2024-11-17 08:19:52.497775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9ff89f cdw11:9f9f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.381 [2024-11-17 08:19:52.497800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.381 #46 NEW cov: 12418 ft: 15127 corp: 26/288b lim: 40 exec/s: 46 rss: 73Mb L: 9/39 MS: 1 CrossOver- 00:07:39.639 [2024-11-17 08:19:52.537892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9ff8 cdw11:9f9f419f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.639 [2024-11-17 08:19:52.537920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.639 #47 NEW cov: 12418 ft: 15141 corp: 27/298b lim: 40 exec/s: 47 rss: 73Mb L: 10/39 MS: 1 InsertByte- 00:07:39.639 [2024-11-17 08:19:52.577971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.639 [2024-11-17 08:19:52.577996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.639 #48 NEW cov: 12418 ft: 15152 corp: 28/307b lim: 40 exec/s: 48 rss: 73Mb L: 9/39 MS: 1 InsertByte- 00:07:39.639 [2024-11-17 08:19:52.618074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9f9ff8 cdw11:9f9ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.639 [2024-11-17 08:19:52.618098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.639 #49 NEW cov: 12418 ft: 15155 corp: 29/320b lim: 40 exec/s: 49 rss: 73Mb L: 13/39 MS: 1 CrossOver- 00:07:39.639 [2024-11-17 08:19:52.658168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fcfffffc cdw11:fffffff8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.639 [2024-11-17 08:19:52.658193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.639 #50 NEW cov: 12418 ft: 15174 corp: 30/330b lim: 40 exec/s: 50 rss: 73Mb L: 10/39 MS: 1 ChangeBinInt- 00:07:39.639 [2024-11-17 08:19:52.718365] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:61606060 cdw11:606060f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.639 [2024-11-17 08:19:52.718390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.639 #51 NEW cov: 12418 ft: 15187 corp: 31/338b lim: 40 exec/s: 51 rss: 73Mb L: 8/39 MS: 1 ChangeBinInt- 00:07:39.639 [2024-11-17 08:19:52.758484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bf9f9f9f cdw11:f89ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.639 [2024-11-17 08:19:52.758509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.898 #52 NEW cov: 12418 ft: 15209 corp: 32/351b lim: 40 exec/s: 52 rss: 73Mb L: 13/39 MS: 1 ShuffleBytes- 00:07:39.898 [2024-11-17 08:19:52.798572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9f9ff89f cdw11:9f9f9f9f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.898 [2024-11-17 08:19:52.798597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.898 #53 NEW cov: 12418 ft: 15239 corp: 33/360b lim: 40 exec/s: 53 rss: 73Mb L: 9/39 MS: 1 CopyPart- 00:07:39.898 [2024-11-17 08:19:52.858732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:82000000 cdw11:9f9ff89f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.898 [2024-11-17 08:19:52.858756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.898 #54 NEW cov: 12418 ft: 15248 corp: 34/373b lim: 40 exec/s: 54 rss: 73Mb L: 13/39 MS: 1 PersAutoDict- DE: "\202\000\000\000"- 00:07:39.898 [2024-11-17 08:19:52.898828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:bc2dffff cdw11:ffffffbc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.898 [2024-11-17 08:19:52.898853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.898 #55 NEW cov: 12418 ft: 15252 corp: 35/387b lim: 40 exec/s: 55 rss: 74Mb L: 14/39 MS: 1 CopyPart- 00:07:39.898 [2024-11-17 08:19:52.959008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fcffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.898 [2024-11-17 08:19:52.959037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.898 #56 NEW cov: 12418 ft: 15269 corp: 36/395b lim: 40 exec/s: 56 rss: 74Mb L: 8/39 MS: 1 CopyPart- 00:07:39.898 [2024-11-17 08:19:52.999117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:fc2dffff cdw11:fffffc2d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.898 [2024-11-17 08:19:52.999142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.158 #58 NEW cov: 12418 ft: 15280 corp: 37/407b lim: 40 exec/s: 29 rss: 74Mb L: 12/39 MS: 2 EraseBytes-CopyPart- 00:07:40.158 #58 DONE cov: 12418 ft: 15280 corp: 37/407b lim: 40 exec/s: 29 rss: 74Mb 00:07:40.158 ###### Recommended dictionary. 
###### 00:07:40.158 "\202\000\000\000" # Uses: 1 00:07:40.158 ###### End of recommended dictionary. ###### 00:07:40.158 Done 58 runs in 2 second(s) 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:40.158 08:19:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:07:40.158 [2024-11-17 08:19:53.192982] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
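
The shell trace above (nvmf/run.sh together with the ../common.sh loop it references) shows how each fuzzer instance is staged: the two-digit fuzzer index is appended to 44 to pick the TCP port (11 -> 4411), a per-run corpus directory is created, the trsvcid in the JSON target config is rewritten to that port, two known leaks are registered in the LeakSanitizer suppression file, and llvm_nvme_fuzz is launched against the freshly started target for -t 1 second. The "\202\000\000\000" entry in the recommended-dictionary block above is the octal escape for the 4-byte token 0x82 00 00 00 that the CMP/PersAutoDict mutations reused during run 10. Below is a condensed, hypothetical re-creation of that per-run setup, not the actual script: variable names such as $rootdir and the output redirections are assumptions, and the authoritative logic lives in the nvmf/run.sh and ../common.sh files named in the log.

# hypothetical condensed sketch of start_llvm_fuzz as traced above
start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local port corpus_dir nvmf_cfg suppress_file trid
    port=44$(printf %02d "$fuzzer_type")            # e.g. 11 -> 4411, 12 -> 4412
    corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    suppress_file=/var/tmp/suppress_nvmf_fuzz

    mkdir -p "$corpus_dir"
    # point this run's copy of the target config at the chosen TCP port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    # suppress the two known/accepted leaks for LeakSanitizer
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"

    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 -P "$rootdir/../output/llvm/" \
        -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type"
}
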
00:07:40.159 [2024-11-17 08:19:53.193050] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid994310 ] 00:07:40.418 [2024-11-17 08:19:53.364918] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.418 [2024-11-17 08:19:53.386412] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.418 [2024-11-17 08:19:53.439078] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.418 [2024-11-17 08:19:53.455406] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:40.418 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.418 INFO: Seed: 1794696908 00:07:40.418 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:40.418 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:40.418 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:40.418 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.418 #2 INITED exec/s: 0 rss: 65Mb 00:07:40.418 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:40.418 This may also happen if the target rejected all inputs we tried so far 00:07:40.418 [2024-11-17 08:19:53.500423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02cbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.418 [2024-11-17 08:19:53.500458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.418 [2024-11-17 08:19:53.500492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.418 [2024-11-17 08:19:53.500507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.418 [2024-11-17 08:19:53.500537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.418 [2024-11-17 08:19:53.500553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.418 [2024-11-17 08:19:53.500581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.418 [2024-11-17 08:19:53.500596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.937 NEW_FUNC[1/715]: 0x468278 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:40.937 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.937 #4 NEW cov: 12203 ft: 12202 corp: 2/36b lim: 40 exec/s: 0 rss: 72Mb L: 35/35 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:40.937 [2024-11-17 08:19:53.883608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:02cbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.883652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:53.883797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.883814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:53.883940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.883956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:53.884088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.884105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.937 #5 NEW cov: 12316 ft: 13006 corp: 3/72b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 InsertByte- 00:07:40.937 [2024-11-17 08:19:53.953735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02cbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.953767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:53.953902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.953920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:53.954049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.954066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:53.954210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:53.954228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.937 #6 NEW cov: 12322 ft: 13227 corp: 4/110b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 CopyPart- 00:07:40.937 [2024-11-17 08:19:54.003973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.004005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.004134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.004151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.004275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.004291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.004436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.004453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.937 #8 NEW cov: 12407 ft: 13572 corp: 5/146b lim: 40 exec/s: 0 rss: 72Mb L: 36/38 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:40.937 [2024-11-17 08:19:54.054427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.054455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.054603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.054620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.054764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.054793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.054920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.937 [2024-11-17 08:19:54.054937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.937 [2024-11-17 08:19:54.055075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.938 [2024-11-17 08:19:54.055091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.197 #9 NEW cov: 12407 ft: 13770 corp: 6/186b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:41.197 [2024-11-17 08:19:54.124554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.124581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.124723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.124739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.124876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff00ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.124891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.125027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.125044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.125172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.125188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.197 #15 NEW cov: 12407 ft: 13838 corp: 7/226b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:41.197 [2024-11-17 08:19:54.194799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.194829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.194968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.194986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.195122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff00ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.195138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.195275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3dffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.195292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.195424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.195442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.197 #16 NEW cov: 12407 ft: 13981 corp: 8/266b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeByte- 00:07:41.197 [2024-11-17 08:19:54.264760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02cbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.264791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.264940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.264958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.265099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.265116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.197 [2024-11-17 08:19:54.265254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cb76cbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.197 [2024-11-17 08:19:54.265274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.197 #17 NEW cov: 12407 ft: 14015 corp: 9/302b lim: 40 exec/s: 0 rss: 72Mb L: 36/40 MS: 1 ChangeByte- 00:07:41.457 [2024-11-17 08:19:54.335027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02cbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.335058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.335192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.335208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.335346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.335363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.335502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.335520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.457 #18 NEW cov: 12407 ft: 14042 corp: 10/335b lim: 40 exec/s: 0 rss: 72Mb L: 33/40 MS: 1 EraseBytes- 00:07:41.457 [2024-11-17 08:19:54.405102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:027ecbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.405131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.405262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.405279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.405419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.405437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.405571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cbcb76cb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.405591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.457 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:41.457 #19 NEW cov: 12430 ft: 14146 corp: 11/372b lim: 40 exec/s: 0 rss: 73Mb L: 37/40 MS: 1 InsertByte- 00:07:41.457 [2024-11-17 08:19:54.475350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0260cbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.475376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.475528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.475547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.475686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.475705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.475853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.475871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.457 #20 NEW cov: 12430 ft: 14164 corp: 12/406b lim: 40 exec/s: 20 rss: 73Mb L: 34/40 MS: 1 InsertByte- 00:07:41.457 [2024-11-17 08:19:54.545615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.545643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.545782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.545800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.545933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.545949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.457 [2024-11-17 08:19:54.546094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.457 [2024-11-17 08:19:54.546111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.457 #21 NEW cov: 12430 ft: 14219 corp: 13/440b lim: 40 exec/s: 21 rss: 73Mb L: 34/40 MS: 1 EraseBytes- 00:07:41.718 [2024-11-17 08:19:54.595426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.595454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.595597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.595614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.595753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.595770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.718 #22 NEW cov: 12430 ft: 14584 corp: 14/464b lim: 40 exec/s: 22 rss: 73Mb L: 24/40 MS: 1 EraseBytes- 00:07:41.718 [2024-11-17 08:19:54.645233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.645260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.645402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.645418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.718 #25 NEW cov: 12430 ft: 14877 corp: 15/486b lim: 40 exec/s: 25 rss: 73Mb L: 22/40 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:41.718 [2024-11-17 08:19:54.695071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.695099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.718 #26 NEW cov: 12430 ft: 15610 corp: 16/500b lim: 40 exec/s: 26 rss: 73Mb L: 14/40 MS: 1 CrossOver- 00:07:41.718 [2024-11-17 08:19:54.766135] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.766163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.766327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.766344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.766481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.766500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.766631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.766647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.718 #27 NEW cov: 12430 ft: 15650 corp: 17/539b lim: 40 exec/s: 27 rss: 73Mb L: 39/40 MS: 1 CopyPart- 00:07:41.718 [2024-11-17 08:19:54.816282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.816309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.816446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.816463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.816603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.718 [2024-11-17 08:19:54.816623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.718 [2024-11-17 08:19:54.816765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.719 [2024-11-17 08:19:54.816781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.719 #28 NEW cov: 12430 ft: 15677 corp: 18/578b lim: 40 exec/s: 28 rss: 73Mb L: 39/40 MS: 1 CopyPart- 00:07:41.978 [2024-11-17 08:19:54.886906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.886935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.887070] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.887086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.887227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.887242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.887378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.887394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.887522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.887537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.978 #29 NEW cov: 12430 ft: 15709 corp: 19/618b lim: 40 exec/s: 29 rss: 73Mb L: 40/40 MS: 1 CrossOver- 00:07:41.978 [2024-11-17 08:19:54.937017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.937044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.937179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.937198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.937337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.937355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.937488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.937507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.978 [2024-11-17 08:19:54.937645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:fffdff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:54.937660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.978 #30 NEW cov: 12430 ft: 15741 corp: 20/658b lim: 40 exec/s: 30 rss: 73Mb L: 40/40 MS: 1 ChangeBit- 00:07:41.978 [2024-11-17 
08:19:55.006086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffff7ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:55.006114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.978 #31 NEW cov: 12430 ft: 15777 corp: 21/672b lim: 40 exec/s: 31 rss: 73Mb L: 14/40 MS: 1 ChangeBit- 00:07:41.978 [2024-11-17 08:19:55.076262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffff7 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.978 [2024-11-17 08:19:55.076293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.238 #32 NEW cov: 12430 ft: 15804 corp: 22/686b lim: 40 exec/s: 32 rss: 73Mb L: 14/40 MS: 1 ShuffleBytes- 00:07:42.238 [2024-11-17 08:19:55.147789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.147818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.147954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.147973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.148113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.148132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.148276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fff7ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.148292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.148433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.148449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.238 #33 NEW cov: 12430 ft: 15821 corp: 23/726b lim: 40 exec/s: 33 rss: 73Mb L: 40/40 MS: 1 CrossOver- 00:07:42.238 [2024-11-17 08:19:55.197898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff2cffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.197932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.198056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.198075] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.198211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff00ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.198227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.198349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3dffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.198369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.238 [2024-11-17 08:19:55.198496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.198515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.238 #34 NEW cov: 12430 ft: 15851 corp: 24/766b lim: 40 exec/s: 34 rss: 73Mb L: 40/40 MS: 1 ChangeByte- 00:07:42.238 [2024-11-17 08:19:55.266877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffff7ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.238 [2024-11-17 08:19:55.266906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.238 #35 NEW cov: 12430 ft: 15854 corp: 25/776b lim: 40 exec/s: 35 rss: 73Mb L: 10/40 MS: 1 EraseBytes- 00:07:42.238 [2024-11-17 08:19:55.316959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffff7 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.239 [2024-11-17 08:19:55.316988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.239 #36 NEW cov: 12430 ft: 15868 corp: 26/790b lim: 40 exec/s: 36 rss: 74Mb L: 14/40 MS: 1 ChangeBit- 00:07:42.498 [2024-11-17 08:19:55.388166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0260cbcb cdw11:cbcb3534 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.498 [2024-11-17 08:19:55.388195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.498 [2024-11-17 08:19:55.388327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:34343434 cdw11:3433cbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.498 [2024-11-17 08:19:55.388344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.498 [2024-11-17 08:19:55.388474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.498 [2024-11-17 08:19:55.388492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.498 [2024-11-17 08:19:55.388629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 
cdw10:cbcbcbcb cdw11:cbcbcbcb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.498 [2024-11-17 08:19:55.388647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.498 #37 NEW cov: 12430 ft: 15879 corp: 27/824b lim: 40 exec/s: 37 rss: 74Mb L: 34/40 MS: 1 ChangeBinInt- 00:07:42.498 [2024-11-17 08:19:55.457449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.498 [2024-11-17 08:19:55.457478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.498 #38 NEW cov: 12430 ft: 15892 corp: 28/838b lim: 40 exec/s: 19 rss: 74Mb L: 14/40 MS: 1 CMP- DE: "\377\377"- 00:07:42.498 #38 DONE cov: 12430 ft: 15892 corp: 28/838b lim: 40 exec/s: 19 rss: 74Mb 00:07:42.498 ###### Recommended dictionary. ###### 00:07:42.498 "\377\377" # Uses: 0 00:07:42.498 ###### End of recommended dictionary. ###### 00:07:42.498 Done 38 runs in 2 second(s) 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:42.498 08:19:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 
-c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:07:42.757 [2024-11-17 08:19:55.653801] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:42.757 [2024-11-17 08:19:55.653885] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid994836 ] 00:07:42.757 [2024-11-17 08:19:55.831439] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.757 [2024-11-17 08:19:55.853927] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.016 [2024-11-17 08:19:55.906434] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.016 [2024-11-17 08:19:55.922733] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:43.016 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.016 INFO: Seed: 4261686890 00:07:43.016 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:43.016 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:43.016 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:43.016 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.016 #2 INITED exec/s: 0 rss: 66Mb 00:07:43.016 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.016 This may also happen if the target rejected all inputs we tried so far 00:07:43.016 [2024-11-17 08:19:55.978374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.016 [2024-11-17 08:19:55.978402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.016 [2024-11-17 08:19:55.978458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.016 [2024-11-17 08:19:55.978472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.275 NEW_FUNC[1/715]: 0x469fe8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:43.275 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.275 #3 NEW cov: 12201 ft: 12200 corp: 2/22b lim: 40 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:07:43.275 [2024-11-17 08:19:56.289085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-17 08:19:56.289116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.275 [2024-11-17 08:19:56.289173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-17 08:19:56.289187] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.275 #4 NEW cov: 12314 ft: 12829 corp: 3/41b lim: 40 exec/s: 0 rss: 72Mb L: 19/21 MS: 1 EraseBytes- 00:07:43.275 [2024-11-17 08:19:56.349217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-17 08:19:56.349243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.275 [2024-11-17 08:19:56.349299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-17 08:19:56.349314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.275 #10 NEW cov: 12320 ft: 13010 corp: 4/64b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:07:43.275 [2024-11-17 08:19:56.409349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-17 08:19:56.409375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.275 [2024-11-17 08:19:56.409431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-17 08:19:56.409445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.534 #11 NEW cov: 12405 ft: 13234 corp: 5/87b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 CMP- DE: "\004\000"- 00:07:43.534 [2024-11-17 08:19:56.469505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.469531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.534 [2024-11-17 08:19:56.469588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.469603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.534 #12 NEW cov: 12405 ft: 13477 corp: 6/108b lim: 40 exec/s: 0 rss: 72Mb L: 21/23 MS: 1 CopyPart- 00:07:43.534 [2024-11-17 08:19:56.509449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.509475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.534 #13 NEW cov: 12405 ft: 14258 corp: 7/119b lim: 40 exec/s: 0 rss: 72Mb L: 11/23 MS: 1 EraseBytes- 00:07:43.534 [2024-11-17 08:19:56.549742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.549788] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.534 [2024-11-17 08:19:56.549848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.549869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.534 #14 NEW cov: 12405 ft: 14322 corp: 8/142b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 ChangeBit- 00:07:43.534 [2024-11-17 08:19:56.589820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.589847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.534 [2024-11-17 08:19:56.589901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.589916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.534 #15 NEW cov: 12405 ft: 14366 corp: 9/161b lim: 40 exec/s: 0 rss: 72Mb L: 19/23 MS: 1 ShuffleBytes- 00:07:43.534 [2024-11-17 08:19:56.629801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.534 [2024-11-17 08:19:56.629828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.534 #16 NEW cov: 12405 ft: 14416 corp: 10/176b lim: 40 exec/s: 0 rss: 73Mb L: 15/23 MS: 1 EraseBytes- 00:07:43.793 [2024-11-17 08:19:56.690125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.690151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.690206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.690221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.793 #17 NEW cov: 12405 ft: 14456 corp: 11/197b lim: 40 exec/s: 0 rss: 73Mb L: 21/23 MS: 1 ChangeBit- 00:07:43.793 [2024-11-17 08:19:56.730355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.730381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.730437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000d1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.730450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 
08:19:56.730507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d1d10000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.730521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.793 #18 NEW cov: 12405 ft: 14702 corp: 12/221b lim: 40 exec/s: 0 rss: 73Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:07:43.793 [2024-11-17 08:19:56.790393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.790418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.790476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.790490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.793 #19 NEW cov: 12405 ft: 14773 corp: 13/242b lim: 40 exec/s: 0 rss: 73Mb L: 21/24 MS: 1 ChangeBit- 00:07:43.793 [2024-11-17 08:19:56.830683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.830714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.830772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.830787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.830844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.830858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.793 #20 NEW cov: 12405 ft: 14824 corp: 14/272b lim: 40 exec/s: 0 rss: 73Mb L: 30/30 MS: 1 CopyPart- 00:07:43.793 [2024-11-17 08:19:56.870639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.870666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.870728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.870743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.793 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:43.793 #21 NEW cov: 12428 ft: 14857 corp: 15/295b lim: 40 exec/s: 0 rss: 73Mb L: 23/30 MS: 1 PersAutoDict- DE: "\004\000"- 
00:07:43.793 [2024-11-17 08:19:56.910736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.910762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.793 [2024-11-17 08:19:56.910821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.793 [2024-11-17 08:19:56.910835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.053 #22 NEW cov: 12428 ft: 14931 corp: 16/318b lim: 40 exec/s: 0 rss: 73Mb L: 23/30 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:44.053 [2024-11-17 08:19:56.970948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:56.970973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.053 [2024-11-17 08:19:56.971029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:56.971043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.053 #23 NEW cov: 12428 ft: 14954 corp: 17/341b lim: 40 exec/s: 23 rss: 73Mb L: 23/30 MS: 1 ShuffleBytes- 00:07:44.053 [2024-11-17 08:19:57.031089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.031115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.053 [2024-11-17 08:19:57.031173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00feff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.031186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.053 #24 NEW cov: 12428 ft: 14985 corp: 18/364b lim: 40 exec/s: 24 rss: 73Mb L: 23/30 MS: 1 ChangeBinInt- 00:07:44.053 [2024-11-17 08:19:57.071194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.071221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.053 [2024-11-17 08:19:57.071281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:04007a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.071294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.053 #25 NEW cov: 12428 ft: 15000 corp: 19/387b lim: 40 exec/s: 25 rss: 73Mb L: 23/30 MS: 1 ChangeByte- 00:07:44.053 [2024-11-17 08:19:57.131374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.131400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.053 [2024-11-17 08:19:57.131454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000feff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.131468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.053 #30 NEW cov: 12428 ft: 15010 corp: 20/405b lim: 40 exec/s: 30 rss: 73Mb L: 18/30 MS: 5 ChangeByte-ChangeBit-ShuffleBytes-ChangeBit-CrossOver- 00:07:44.053 [2024-11-17 08:19:57.171421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.171447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.053 [2024-11-17 08:19:57.171503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.053 [2024-11-17 08:19:57.171517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.312 #31 NEW cov: 12428 ft: 15026 corp: 21/427b lim: 40 exec/s: 31 rss: 73Mb L: 22/30 MS: 1 EraseBytes- 00:07:44.312 [2024-11-17 08:19:57.211878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.312 [2024-11-17 08:19:57.211904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.312 [2024-11-17 08:19:57.211962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.312 [2024-11-17 08:19:57.211976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.312 [2024-11-17 08:19:57.212030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.312 [2024-11-17 08:19:57.212046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.313 [2024-11-17 08:19:57.212102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00feff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.212115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.313 #32 NEW cov: 12428 ft: 15348 corp: 22/466b lim: 40 exec/s: 32 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:44.313 [2024-11-17 08:19:57.271632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.271657] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.313 #33 NEW cov: 12428 ft: 15367 corp: 23/476b lim: 40 exec/s: 33 rss: 73Mb L: 10/39 MS: 1 CrossOver- 00:07:44.313 [2024-11-17 08:19:57.311863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.311889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.313 [2024-11-17 08:19:57.311945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000fe04 cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.311959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.313 #34 NEW cov: 12428 ft: 15393 corp: 24/496b lim: 40 exec/s: 34 rss: 73Mb L: 20/39 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:44.313 [2024-11-17 08:19:57.372088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.372113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.313 [2024-11-17 08:19:57.372170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.372184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.313 #35 NEW cov: 12428 ft: 15406 corp: 25/519b lim: 40 exec/s: 35 rss: 73Mb L: 23/39 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:44.313 [2024-11-17 08:19:57.412100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.412126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.313 [2024-11-17 08:19:57.412183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00feff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.313 [2024-11-17 08:19:57.412197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.313 #36 NEW cov: 12428 ft: 15430 corp: 26/542b lim: 40 exec/s: 36 rss: 73Mb L: 23/39 MS: 1 CrossOver- 00:07:44.572 [2024-11-17 08:19:57.452095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.572 [2024-11-17 08:19:57.452121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.572 #37 NEW cov: 12428 ft: 15507 corp: 27/555b lim: 40 exec/s: 37 rss: 73Mb L: 13/39 MS: 1 CrossOver- 00:07:44.572 [2024-11-17 08:19:57.492351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.572 [2024-11-17 08:19:57.492380] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.572 [2024-11-17 08:19:57.492438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:04000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.572 [2024-11-17 08:19:57.492453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.572 #38 NEW cov: 12428 ft: 15525 corp: 28/572b lim: 40 exec/s: 38 rss: 73Mb L: 17/39 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:44.572 [2024-11-17 08:19:57.552547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.572 [2024-11-17 08:19:57.552572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.572 [2024-11-17 08:19:57.552627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0400fe04 cdw11:00ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.572 [2024-11-17 08:19:57.552641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.572 #39 NEW cov: 12428 ft: 15538 corp: 29/592b lim: 40 exec/s: 39 rss: 74Mb L: 20/39 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:44.572 [2024-11-17 08:19:57.612722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000009f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.572 [2024-11-17 08:19:57.612748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.572 [2024-11-17 08:19:57.612806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.573 [2024-11-17 08:19:57.612819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.573 #40 NEW cov: 12428 ft: 15543 corp: 30/615b lim: 40 exec/s: 40 rss: 74Mb L: 23/39 MS: 1 InsertRepeatedBytes- 00:07:44.573 [2024-11-17 08:19:57.672889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:fc000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.573 [2024-11-17 08:19:57.672915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.573 [2024-11-17 08:19:57.672971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.573 [2024-11-17 08:19:57.672985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.573 #41 NEW cov: 12428 ft: 15548 corp: 31/636b lim: 40 exec/s: 41 rss: 74Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:44.832 [2024-11-17 08:19:57.712883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.712909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.832 #42 NEW cov: 12428 ft: 15586 corp: 32/646b lim: 40 exec/s: 42 rss: 74Mb L: 10/39 MS: 1 ChangeByte- 00:07:44.832 [2024-11-17 08:19:57.753107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.753133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.753186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.753204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.832 #43 NEW cov: 12428 ft: 15631 corp: 33/664b lim: 40 exec/s: 43 rss: 74Mb L: 18/39 MS: 1 EraseBytes- 00:07:44.832 [2024-11-17 08:19:57.813447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00008181 cdw11:81818181 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.813473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.813528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:81818100 cdw11:00fc0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.813543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.813597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.813611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.832 #44 NEW cov: 12428 ft: 15641 corp: 34/694b lim: 40 exec/s: 44 rss: 74Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:07:44.832 [2024-11-17 08:19:57.873830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.873856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.873909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000d1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.873922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.873976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d1000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.873990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.874043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000d1d1 cdw11:d10000d1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.874057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.832 #45 NEW cov: 12428 ft: 15644 corp: 35/732b lim: 40 exec/s: 45 rss: 74Mb L: 38/39 MS: 1 CopyPart- 00:07:44.832 [2024-11-17 08:19:57.933627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.933652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.832 [2024-11-17 08:19:57.933714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.832 [2024-11-17 08:19:57.933729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.832 #46 NEW cov: 12428 ft: 15645 corp: 36/755b lim: 40 exec/s: 46 rss: 74Mb L: 23/39 MS: 1 ChangeByte- 00:07:45.093 [2024-11-17 08:19:57.974066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-17 08:19:57.974092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.093 [2024-11-17 08:19:57.974151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00009f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-17 08:19:57.974165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.093 [2024-11-17 08:19:57.974219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-17 08:19:57.974233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.093 [2024-11-17 08:19:57.974288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-17 08:19:57.974302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.093 #47 NEW cov: 12428 ft: 15698 corp: 37/794b lim: 40 exec/s: 23 rss: 74Mb L: 39/39 MS: 1 CrossOver- 00:07:45.093 #47 DONE cov: 12428 ft: 15698 corp: 37/794b lim: 40 exec/s: 23 rss: 74Mb 00:07:45.093 ###### Recommended dictionary. ###### 00:07:45.093 "\004\000" # Uses: 6 00:07:45.093 ###### End of recommended dictionary. 
###### 00:07:45.093 Done 47 runs in 2 second(s) 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:45.093 08:19:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:07:45.093 [2024-11-17 08:19:58.157510] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:45.093 [2024-11-17 08:19:58.157581] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid995132 ] 00:07:45.353 [2024-11-17 08:19:58.341839] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.353 [2024-11-17 08:19:58.364070] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.353 [2024-11-17 08:19:58.416531] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.354 [2024-11-17 08:19:58.432846] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:45.354 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.354 INFO: Seed: 2476733350 00:07:45.354 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:45.354 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:45.354 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:45.354 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.354 #2 INITED exec/s: 0 rss: 65Mb 00:07:45.354 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.354 This may also happen if the target rejected all inputs we tried so far 00:07:45.354 [2024-11-17 08:19:58.478658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:06ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.354 [2024-11-17 08:19:58.478688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.354 [2024-11-17 08:19:58.478752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.354 [2024-11-17 08:19:58.478766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.354 [2024-11-17 08:19:58.478826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.354 [2024-11-17 08:19:58.478841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.354 [2024-11-17 08:19:58.478897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.354 [2024-11-17 08:19:58.478910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.873 NEW_FUNC[1/714]: 0x46bbb8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:45.873 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.873 #11 NEW cov: 12189 ft: 12188 corp: 2/33b lim: 40 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 ChangeBit-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:45.873 [2024-11-17 08:19:58.788988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8fff03 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.789028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.873 #16 NEW cov: 12302 ft: 13529 corp: 3/43b lim: 40 exec/s: 0 rss: 72Mb L: 10/32 MS: 5 InsertByte-ChangeByte-ChangeBit-ShuffleBytes-CMP- DE: "\377\003\000\000\000\000\000\000"- 00:07:45.873 [2024-11-17 08:19:58.829382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.829408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.829464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.829478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.829533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.829550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.829605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.829618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.873 #17 NEW cov: 12308 ft: 13763 corp: 4/75b lim: 40 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 CrossOver- 00:07:45.873 [2024-11-17 08:19:58.889390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.889417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.889472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.889485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.889541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.889555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.873 #18 NEW cov: 12393 ft: 14217 corp: 5/104b lim: 40 exec/s: 0 rss: 72Mb L: 29/32 MS: 1 CrossOver- 00:07:45.873 [2024-11-17 08:19:58.929584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 
[2024-11-17 08:19:58.929610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.929664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.929678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.929727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.929741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.929799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff030000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.929813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.873 #19 NEW cov: 12393 ft: 14285 corp: 6/136b lim: 40 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:07:45.873 [2024-11-17 08:19:58.989892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.989918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.989974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.989988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.990047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.990062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.990117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.990131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.873 [2024-11-17 08:19:58.990185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.873 [2024-11-17 08:19:58.990198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.131 #20 NEW cov: 12393 ft: 14420 corp: 7/176b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:07:46.131 [2024-11-17 08:19:59.029625] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.029651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.131 [2024-11-17 08:19:59.029709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.029723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.131 #21 NEW cov: 12393 ft: 14698 corp: 8/199b lim: 40 exec/s: 0 rss: 72Mb L: 23/40 MS: 1 InsertRepeatedBytes- 00:07:46.131 [2024-11-17 08:19:59.069924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ac7c7c7 cdw11:c7c7c7c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.069950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.131 [2024-11-17 08:19:59.070007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:c7c7c7c7 cdw11:c7c7c7c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.070021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.131 [2024-11-17 08:19:59.070078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:c7c7c7c7 cdw11:c7c7ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.070092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.131 #24 NEW cov: 12393 ft: 14726 corp: 9/225b lim: 40 exec/s: 0 rss: 72Mb L: 26/40 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:07:46.131 [2024-11-17 08:19:59.109873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.109899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.131 [2024-11-17 08:19:59.109955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.109969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.131 #25 NEW cov: 12393 ft: 14790 corp: 10/247b lim: 40 exec/s: 0 rss: 72Mb L: 22/40 MS: 1 EraseBytes- 00:07:46.131 [2024-11-17 08:19:59.169907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff030000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.169939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.131 #26 NEW cov: 12393 ft: 14844 corp: 11/256b lim: 40 exec/s: 0 rss: 72Mb L: 9/40 MS: 1 PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:07:46.131 [2024-11-17 08:19:59.210393] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.210418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.131 [2024-11-17 08:19:59.210475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.131 [2024-11-17 08:19:59.210489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.132 [2024-11-17 08:19:59.210544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.210558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.132 [2024-11-17 08:19:59.210613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.210626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.132 #27 NEW cov: 12393 ft: 14855 corp: 12/289b lim: 40 exec/s: 0 rss: 72Mb L: 33/40 MS: 1 EraseBytes- 00:07:46.132 [2024-11-17 08:19:59.250599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.250624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.132 [2024-11-17 08:19:59.250681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.250700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.132 [2024-11-17 08:19:59.250757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.250771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.132 [2024-11-17 08:19:59.250826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffc7 cdw11:6c9d2f91 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.250839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.132 [2024-11-17 08:19:59.250895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:968a00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.132 [2024-11-17 08:19:59.250908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.390 #28 NEW cov: 12393 ft: 14875 corp: 13/329b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CMP- 
DE: "\307l\235/\221\226\212\000"- 00:07:46.390 [2024-11-17 08:19:59.290489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.290515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.390 [2024-11-17 08:19:59.290575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.290589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.390 [2024-11-17 08:19:59.290645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.290659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.390 #34 NEW cov: 12393 ft: 14889 corp: 14/359b lim: 40 exec/s: 0 rss: 73Mb L: 30/40 MS: 1 InsertByte- 00:07:46.390 [2024-11-17 08:19:59.350872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.350897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.390 [2024-11-17 08:19:59.350950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.350964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.390 [2024-11-17 08:19:59.351018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.351032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.390 [2024-11-17 08:19:59.351085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffc7 cdw11:6c9d2f91 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.351099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.390 [2024-11-17 08:19:59.351155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:6675ff00 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.351168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.390 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:46.390 #35 NEW cov: 12416 ft: 14930 corp: 15/399b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:46.390 [2024-11-17 08:19:59.410916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.390 [2024-11-17 08:19:59.410942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 08:19:59.410997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.411011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 08:19:59.411067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffebeb cdw11:ebebebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.411080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 08:19:59.411135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ebebebeb cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.411152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.391 #36 NEW cov: 12416 ft: 14945 corp: 16/432b lim: 40 exec/s: 0 rss: 73Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:07:46.391 [2024-11-17 08:19:59.471123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.471150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 08:19:59.471206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.471220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 08:19:59.471277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.471291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 08:19:59.471344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.471357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.391 #40 NEW cov: 12416 ft: 14948 corp: 17/465b lim: 40 exec/s: 40 rss: 73Mb L: 33/40 MS: 4 ChangeBit-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:07:46.391 [2024-11-17 08:19:59.510962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060ac76c cdw11:9d2f9196 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.510988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.391 [2024-11-17 
08:19:59.511045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8a00ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.391 [2024-11-17 08:19:59.511059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.649 #41 NEW cov: 12416 ft: 14949 corp: 18/487b lim: 40 exec/s: 41 rss: 73Mb L: 22/40 MS: 1 PersAutoDict- DE: "\307l\235/\221\226\212\000"- 00:07:46.649 [2024-11-17 08:19:59.571099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060ac76c cdw11:8a00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.571124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.649 [2024-11-17 08:19:59.571183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.571197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.649 #42 NEW cov: 12416 ft: 14958 corp: 19/509b lim: 40 exec/s: 42 rss: 73Mb L: 22/40 MS: 1 CopyPart- 00:07:46.649 [2024-11-17 08:19:59.631298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.631324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.649 [2024-11-17 08:19:59.631381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.631394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.649 #43 NEW cov: 12416 ft: 14967 corp: 20/531b lim: 40 exec/s: 43 rss: 73Mb L: 22/40 MS: 1 InsertRepeatedBytes- 00:07:46.649 [2024-11-17 08:19:59.671278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.671304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.649 #49 NEW cov: 12416 ft: 14975 corp: 21/540b lim: 40 exec/s: 49 rss: 73Mb L: 9/40 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:46.649 [2024-11-17 08:19:59.731592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060ac76c cdw11:9d2f9196 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.731619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.649 [2024-11-17 08:19:59.731675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8a00ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.731689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.649 #50 NEW cov: 12416 ft: 15060 corp: 22/562b 
lim: 40 exec/s: 50 rss: 73Mb L: 22/40 MS: 1 ShuffleBytes- 00:07:46.649 [2024-11-17 08:19:59.771810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ac7c7c7 cdw11:c7c7c7c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.771836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.649 [2024-11-17 08:19:59.771893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:c7c7c7c7 cdw11:c7c7c7c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.771907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.649 [2024-11-17 08:19:59.771962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000001a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.649 [2024-11-17 08:19:59.771976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.908 #51 NEW cov: 12416 ft: 15068 corp: 23/588b lim: 40 exec/s: 51 rss: 73Mb L: 26/40 MS: 1 ChangeBinInt- 00:07:46.908 [2024-11-17 08:19:59.832194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.832220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.908 [2024-11-17 08:19:59.832276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffefffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.832290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.908 [2024-11-17 08:19:59.832341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.832356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.908 [2024-11-17 08:19:59.832409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffc7 cdw11:6c9d2f91 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.832422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.908 [2024-11-17 08:19:59.832477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:968a00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.832494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.908 #52 NEW cov: 12416 ft: 15087 corp: 24/628b lim: 40 exec/s: 52 rss: 73Mb L: 40/40 MS: 1 ChangeBit- 00:07:46.908 [2024-11-17 08:19:59.871967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060ac76c cdw11:8a00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.871993] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.908 [2024-11-17 08:19:59.872051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.872065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.908 #53 NEW cov: 12416 ft: 15096 corp: 25/650b lim: 40 exec/s: 53 rss: 73Mb L: 22/40 MS: 1 ChangeBit- 00:07:46.908 [2024-11-17 08:19:59.931985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00770000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.932011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.908 #54 NEW cov: 12416 ft: 15106 corp: 26/659b lim: 40 exec/s: 54 rss: 73Mb L: 9/40 MS: 1 ChangeByte- 00:07:46.908 [2024-11-17 08:19:59.992132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:19:59.992159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.908 #55 NEW cov: 12416 ft: 15158 corp: 27/668b lim: 40 exec/s: 55 rss: 73Mb L: 9/40 MS: 1 ShuffleBytes- 00:07:46.908 [2024-11-17 08:20:00.032324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:008fff03 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.908 [2024-11-17 08:20:00.032353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.168 #56 NEW cov: 12416 ft: 15211 corp: 28/682b lim: 40 exec/s: 56 rss: 74Mb L: 14/40 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:47.168 [2024-11-17 08:20:00.092648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.092680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.168 [2024-11-17 08:20:00.092744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.092759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.168 #57 NEW cov: 12416 ft: 15216 corp: 29/704b lim: 40 exec/s: 57 rss: 74Mb L: 22/40 MS: 1 CopyPart- 00:07:47.168 [2024-11-17 08:20:00.132585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:06007700 cdw11:000affff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.132611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.168 #58 NEW cov: 12416 ft: 15226 corp: 30/719b lim: 40 exec/s: 58 rss: 74Mb L: 15/40 MS: 1 CrossOver- 00:07:47.168 [2024-11-17 08:20:00.172853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.172880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.168 [2024-11-17 08:20:00.172940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.172955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.168 #59 NEW cov: 12416 ft: 15227 corp: 31/738b lim: 40 exec/s: 59 rss: 74Mb L: 19/40 MS: 1 EraseBytes- 00:07:47.168 [2024-11-17 08:20:00.212807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060077ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.212833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.168 #60 NEW cov: 12416 ft: 15237 corp: 32/753b lim: 40 exec/s: 60 rss: 74Mb L: 15/40 MS: 1 ShuffleBytes- 00:07:47.168 [2024-11-17 08:20:00.273511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.273537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.168 [2024-11-17 08:20:00.273593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.273607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.168 [2024-11-17 08:20:00.273673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.273687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.168 [2024-11-17 08:20:00.273748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffc7 cdw11:6c9d2f91 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.273762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.168 [2024-11-17 08:20:00.273816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:6675ff00 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.168 [2024-11-17 08:20:00.273830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.427 #61 NEW cov: 12416 ft: 15244 corp: 33/793b lim: 40 exec/s: 61 rss: 74Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:47.427 [2024-11-17 08:20:00.333522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:06ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.333548] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.427 [2024-11-17 08:20:00.333605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.333619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.427 [2024-11-17 08:20:00.333672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.333685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.427 [2024-11-17 08:20:00.333740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.333757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.427 #62 NEW cov: 12416 ft: 15269 corp: 34/825b lim: 40 exec/s: 62 rss: 74Mb L: 32/40 MS: 1 ShuffleBytes- 00:07:47.427 [2024-11-17 08:20:00.373343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.373368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.427 [2024-11-17 08:20:00.373424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff0300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.373438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.427 #63 NEW cov: 12416 ft: 15275 corp: 35/844b lim: 40 exec/s: 63 rss: 74Mb L: 19/40 MS: 1 PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:07:47.427 [2024-11-17 08:20:00.433610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:060afff7 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.433636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.427 [2024-11-17 08:20:00.433700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.427 [2024-11-17 08:20:00.433714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.427 #64 NEW cov: 12416 ft: 15288 corp: 36/866b lim: 40 exec/s: 64 rss: 74Mb L: 22/40 MS: 1 ChangeBinInt- 00:07:47.428 [2024-11-17 08:20:00.473533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00c76c9d cdw11:2f91968a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.428 [2024-11-17 08:20:00.473558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.428 #65 NEW cov: 12416 ft: 15302 corp: 
37/875b lim: 40 exec/s: 32 rss: 74Mb L: 9/40 MS: 1 PersAutoDict- DE: "\307l\235/\221\226\212\000"- 00:07:47.428 #65 DONE cov: 12416 ft: 15302 corp: 37/875b lim: 40 exec/s: 32 rss: 74Mb 00:07:47.428 ###### Recommended dictionary. ###### 00:07:47.428 "\377\003\000\000\000\000\000\000" # Uses: 4 00:07:47.428 "\307l\235/\221\226\212\000" # Uses: 2 00:07:47.428 "\000\000\000\000" # Uses: 1 00:07:47.428 ###### End of recommended dictionary. ###### 00:07:47.428 Done 65 runs in 2 second(s) 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:47.687 08:20:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:07:47.687 [2024-11-17 08:20:00.673962] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:47.687 [2024-11-17 08:20:00.674032] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid995657 ] 00:07:47.947 [2024-11-17 08:20:00.848638] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.947 [2024-11-17 08:20:00.870493] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.947 [2024-11-17 08:20:00.922917] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.947 [2024-11-17 08:20:00.939242] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:47.947 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.947 INFO: Seed: 688745270 00:07:47.947 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:47.947 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:47.947 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:47.947 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.947 #2 INITED exec/s: 0 rss: 65Mb 00:07:47.947 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.947 This may also happen if the target rejected all inputs we tried so far 00:07:47.947 [2024-11-17 08:20:01.016431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.947 [2024-11-17 08:20:01.016477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.947 [2024-11-17 08:20:01.016612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.947 [2024-11-17 08:20:01.016632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.947 [2024-11-17 08:20:01.016780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:7 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.947 [2024-11-17 08:20:01.016806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.207 NEW_FUNC[1/718]: 0x46d788 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:48.207 NEW_FUNC[2/718]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:48.207 #11 NEW cov: 12222 ft: 12223 corp: 2/33b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 ShuffleBytes-CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:07:48.467 [2024-11-17 08:20:01.357277] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.357321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.467 [2024-11-17 08:20:01.357473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 
cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.357498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.467 #12 NEW cov: 12353 ft: 13217 corp: 3/60b lim: 35 exec/s: 0 rss: 72Mb L: 27/32 MS: 1 CrossOver- 00:07:48.467 [2024-11-17 08:20:01.427345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.427382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.467 [2024-11-17 08:20:01.427525] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.427551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.467 #13 NEW cov: 12359 ft: 13413 corp: 4/87b lim: 35 exec/s: 0 rss: 72Mb L: 27/32 MS: 1 CopyPart- 00:07:48.467 [2024-11-17 08:20:01.497587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.497623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.467 [2024-11-17 08:20:01.497761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.497790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.467 #14 NEW cov: 12444 ft: 13755 corp: 5/109b lim: 35 exec/s: 0 rss: 72Mb L: 22/32 MS: 1 EraseBytes- 00:07:48.467 [2024-11-17 08:20:01.547780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:5 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.547818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.467 [2024-11-17 08:20:01.547955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:6 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.547981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.467 NEW_FUNC[1/1]: 0x48d698 in feat_interrupt_coalescing /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:325 00:07:48.467 #15 NEW cov: 12468 ft: 13951 corp: 6/133b lim: 35 exec/s: 0 rss: 72Mb L: 24/32 MS: 1 InsertRepeatedBytes- 00:07:48.467 [2024-11-17 08:20:01.597850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:5 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.597886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.467 [2024-11-17 08:20:01.598022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:6 cdw10:00000008 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.467 [2024-11-17 08:20:01.598048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.727 #16 NEW cov: 12468 ft: 14033 corp: 7/157b lim: 35 exec/s: 0 rss: 72Mb L: 24/32 MS: 1 ChangeBit- 00:07:48.727 [2024-11-17 08:20:01.668141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.668177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.727 [2024-11-17 08:20:01.668338] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000016 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.668359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.727 #17 NEW cov: 12471 ft: 14242 corp: 8/179b lim: 35 exec/s: 0 rss: 73Mb L: 22/32 MS: 1 ChangeBinInt- 00:07:48.727 [2024-11-17 08:20:01.738652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.738689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.727 [2024-11-17 08:20:01.738825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.738845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.727 [2024-11-17 08:20:01.738982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:7 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.739003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.727 #18 NEW cov: 12471 ft: 14271 corp: 9/213b lim: 35 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 CopyPart- 00:07:48.727 [2024-11-17 08:20:01.788481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.788518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.727 [2024-11-17 08:20:01.788660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.788680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.727 #19 NEW cov: 12471 ft: 14279 corp: 10/235b lim: 35 exec/s: 0 rss: 73Mb L: 22/34 MS: 1 CopyPart- 00:07:48.727 [2024-11-17 08:20:01.838977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.839012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:48.727 [2024-11-17 08:20:01.839254] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:7 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.727 [2024-11-17 08:20:01.839280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.727 #20 NEW cov: 12471 ft: 14340 corp: 11/266b lim: 35 exec/s: 0 rss: 73Mb L: 31/34 MS: 1 CopyPart- 00:07:48.986 [2024-11-17 08:20:01.888428] ctrlr.c:1865:nvmf_ctrlr_set_features_reservation_notification_mask: *ERROR*: Set Features - Invalid Namespace ID 00:07:48.986 [2024-11-17 08:20:01.889147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:01.889183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 [2024-11-17 08:20:01.889330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:01.889349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.986 [2024-11-17 08:20:01.889492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:7 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:01.889519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.986 NEW_FUNC[1/2]: 0x1362958 in nvmf_ctrlr_set_features_reservation_notification_mask /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1846 00:07:48.986 NEW_FUNC[2/2]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:48.986 #21 NEW cov: 12520 ft: 14430 corp: 12/294b lim: 35 exec/s: 0 rss: 73Mb L: 28/34 MS: 1 InsertByte- 00:07:48.986 [2024-11-17 08:20:01.969238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:5 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:01.969273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 [2024-11-17 08:20:01.969410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:6 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:01.969436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.986 #22 NEW cov: 12520 ft: 14501 corp: 13/319b lim: 35 exec/s: 22 rss: 73Mb L: 25/34 MS: 1 InsertByte- 00:07:48.986 [2024-11-17 08:20:02.039201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:02.039230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 [2024-11-17 08:20:02.039374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000016 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:02.039400] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.986 #23 NEW cov: 12520 ft: 14550 corp: 14/341b lim: 35 exec/s: 23 rss: 73Mb L: 22/34 MS: 1 CopyPart- 00:07:48.986 [2024-11-17 08:20:02.109334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000026 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:02.109361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.986 [2024-11-17 08:20:02.109493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:02.109514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.986 [2024-11-17 08:20:02.109655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000016 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.986 [2024-11-17 08:20:02.109682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.245 #24 NEW cov: 12526 ft: 14611 corp: 15/363b lim: 35 exec/s: 24 rss: 73Mb L: 22/34 MS: 1 ChangeByte- 00:07:49.245 [2024-11-17 08:20:02.179716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.179755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.245 [2024-11-17 08:20:02.179917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.179943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.245 #25 NEW cov: 12526 ft: 14642 corp: 16/384b lim: 35 exec/s: 25 rss: 73Mb L: 21/34 MS: 1 EraseBytes- 00:07:49.245 #26 NEW cov: 12526 ft: 15246 corp: 17/391b lim: 35 exec/s: 26 rss: 73Mb L: 7/34 MS: 1 CrossOver- 00:07:49.245 [2024-11-17 08:20:02.300354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.300394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.245 [2024-11-17 08:20:02.300563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.300586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.245 [2024-11-17 08:20:02.300747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.300766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.245 #27 NEW cov: 12526 ft: 15263 corp: 18/425b lim: 35 exec/s: 27 rss: 73Mb L: 34/34 MS: 1 CMP- DE: 
"\037\000"- 00:07:49.245 [2024-11-17 08:20:02.350149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.350181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.245 [2024-11-17 08:20:02.350341] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.245 [2024-11-17 08:20:02.350367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.504 #33 NEW cov: 12526 ft: 15269 corp: 19/449b lim: 35 exec/s: 33 rss: 73Mb L: 24/34 MS: 1 PersAutoDict- DE: "\037\000"- 00:07:49.504 [2024-11-17 08:20:02.420362] ctrlr.c:1865:nvmf_ctrlr_set_features_reservation_notification_mask: *ERROR*: Set Features - Invalid Namespace ID 00:07:49.504 [2024-11-17 08:20:02.421102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.421139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.504 [2024-11-17 08:20:02.421278] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.421305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.504 [2024-11-17 08:20:02.421445] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:7 cdw10:00000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.421464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.504 [2024-11-17 08:20:02.421595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:8 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.421621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.504 #34 NEW cov: 12526 ft: 15496 corp: 20/484b lim: 35 exec/s: 34 rss: 74Mb L: 35/35 MS: 1 CopyPart- 00:07:49.504 [2024-11-17 08:20:02.490673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.490708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.504 [2024-11-17 08:20:02.490840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.490864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.504 #35 NEW cov: 12526 ft: 15554 corp: 21/508b lim: 35 exec/s: 35 rss: 74Mb L: 24/35 MS: 1 ChangeByte- 00:07:49.504 [2024-11-17 08:20:02.560811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 
cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.560850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.504 [2024-11-17 08:20:02.561007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.561031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.504 #36 NEW cov: 12526 ft: 15598 corp: 22/535b lim: 35 exec/s: 36 rss: 74Mb L: 27/35 MS: 1 ChangeBit- 00:07:49.504 [2024-11-17 08:20:02.610710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-17 08:20:02.610745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.504 #37 NEW cov: 12526 ft: 15809 corp: 23/553b lim: 35 exec/s: 37 rss: 74Mb L: 18/35 MS: 1 CrossOver- 00:07:49.763 [2024-11-17 08:20:02.661182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.661218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.763 [2024-11-17 08:20:02.661351] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:6 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.661376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.763 #38 NEW cov: 12526 ft: 15820 corp: 24/574b lim: 35 exec/s: 38 rss: 74Mb L: 21/35 MS: 1 CrossOver- 00:07:49.763 [2024-11-17 08:20:02.710896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.710927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.763 [2024-11-17 08:20:02.711066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000a2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.711089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.763 #39 NEW cov: 12526 ft: 15873 corp: 25/594b lim: 35 exec/s: 39 rss: 74Mb L: 20/35 MS: 1 InsertRepeatedBytes- 00:07:49.763 [2024-11-17 08:20:02.781927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:5 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.781963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.763 [2024-11-17 08:20:02.782214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000b6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.782237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:07:49.763 #40 NEW cov: 12526 ft: 15885 corp: 26/625b lim: 35 exec/s: 40 rss: 74Mb L: 31/35 MS: 1 ChangeByte- 00:07:49.763 [2024-11-17 08:20:02.841443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.763 [2024-11-17 08:20:02.841478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.763 #41 NEW cov: 12526 ft: 15892 corp: 27/643b lim: 35 exec/s: 41 rss: 74Mb L: 18/35 MS: 1 ChangeBit- 00:07:50.023 [2024-11-17 08:20:02.911921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.023 [2024-11-17 08:20:02.911949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.023 [2024-11-17 08:20:02.912089] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST RESERVE MASK cid:6 cdw10:80000082 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.023 [2024-11-17 08:20:02.912108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.023 #42 NEW cov: 12526 ft: 15900 corp: 28/667b lim: 35 exec/s: 42 rss: 74Mb L: 24/35 MS: 1 PersAutoDict- DE: "\037\000"- 00:07:50.023 [2024-11-17 08:20:02.961996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:4 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.023 [2024-11-17 08:20:02.962030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.023 [2024-11-17 08:20:02.962163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:5 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.023 [2024-11-17 08:20:02.962188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.023 [2024-11-17 08:20:02.962331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES INTERRUPT COALESCING cid:6 cdw10:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.023 [2024-11-17 08:20:02.962355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE NOT CHANGEABLE (01/0e) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.023 #43 NEW cov: 12526 ft: 15933 corp: 29/690b lim: 35 exec/s: 21 rss: 74Mb L: 23/35 MS: 1 EraseBytes- 00:07:50.023 #43 DONE cov: 12526 ft: 15933 corp: 29/690b lim: 35 exec/s: 21 rss: 74Mb 00:07:50.023 ###### Recommended dictionary. ###### 00:07:50.023 "\037\000" # Uses: 3 00:07:50.023 ###### End of recommended dictionary. 
###### 00:07:50.023 Done 43 runs in 2 second(s) 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:50.023 08:20:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:07:50.023 [2024-11-17 08:20:03.149828] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:50.023 [2024-11-17 08:20:03.149897] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid996115 ] 00:07:50.282 [2024-11-17 08:20:03.334265] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.282 [2024-11-17 08:20:03.356549] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.282 [2024-11-17 08:20:03.409263] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.541 [2024-11-17 08:20:03.425553] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:50.541 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.541 INFO: Seed: 3173193526 00:07:50.541 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:50.541 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:50.541 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:50.541 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.541 #2 INITED exec/s: 0 rss: 65Mb 00:07:50.541 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:50.541 This may also happen if the target rejected all inputs we tried so far 00:07:50.541 [2024-11-17 08:20:03.474377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.541 [2024-11-17 08:20:03.474406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.800 NEW_FUNC[1/714]: 0x46ecc8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:50.800 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.800 #4 NEW cov: 12171 ft: 12170 corp: 2/11b lim: 35 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 CrossOver-CMP- DE: "\000\000\000\000\000\000\000."- 00:07:50.800 [2024-11-17 08:20:03.805157] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.800 [2024-11-17 08:20:03.805188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.800 [2024-11-17 08:20:03.805249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000037 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.800 [2024-11-17 08:20:03.805263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.800 #5 NEW cov: 12284 ft: 13094 corp: 3/26b lim: 35 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:07:50.800 [2024-11-17 08:20:03.865115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.800 [2024-11-17 08:20:03.865140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.800 #6 NEW cov: 12290 ft: 13271 corp: 4/37b lim: 35 exec/s: 0 rss: 72Mb L: 11/15 MS: 1 EraseBytes- 
00:07:50.800 [2024-11-17 08:20:03.925255] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.800 [2024-11-17 08:20:03.925280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.059 #7 NEW cov: 12375 ft: 13508 corp: 5/48b lim: 35 exec/s: 0 rss: 72Mb L: 11/15 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000."- 00:07:51.059 NEW_FUNC[1/1]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:51.059 #8 NEW cov: 12389 ft: 13808 corp: 6/58b lim: 35 exec/s: 0 rss: 72Mb L: 10/15 MS: 1 CrossOver- 00:07:51.059 #13 NEW cov: 12389 ft: 13893 corp: 7/65b lim: 35 exec/s: 0 rss: 72Mb L: 7/15 MS: 5 CopyPart-InsertRepeatedBytes-ChangeByte-ChangeByte-InsertByte- 00:07:51.059 [2024-11-17 08:20:04.065601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.059 [2024-11-17 08:20:04.065627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.059 #14 NEW cov: 12389 ft: 13952 corp: 8/76b lim: 35 exec/s: 0 rss: 72Mb L: 11/15 MS: 1 ChangeByte- 00:07:51.059 [2024-11-17 08:20:04.105739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.059 [2024-11-17 08:20:04.105764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.059 #15 NEW cov: 12389 ft: 14012 corp: 9/87b lim: 35 exec/s: 0 rss: 72Mb L: 11/15 MS: 1 ChangeByte- 00:07:51.059 [2024-11-17 08:20:04.166016] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.059 [2024-11-17 08:20:04.166042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.059 [2024-11-17 08:20:04.166098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000002e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.059 [2024-11-17 08:20:04.166112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.318 #16 NEW cov: 12389 ft: 14041 corp: 10/107b lim: 35 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 CrossOver- 00:07:51.318 [2024-11-17 08:20:04.226067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.226092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.318 #17 NEW cov: 12389 ft: 14082 corp: 11/118b lim: 35 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 ShuffleBytes- 00:07:51.318 [2024-11-17 08:20:04.266210] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.266237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.318 #18 NEW cov: 12389 ft: 14095 corp: 12/129b lim: 35 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 
ChangeASCIIInt- 00:07:51.318 [2024-11-17 08:20:04.306294] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.306320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.318 #19 NEW cov: 12389 ft: 14121 corp: 13/140b lim: 35 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:51.318 [2024-11-17 08:20:04.346544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.346571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.318 [2024-11-17 08:20:04.346631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.346645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.318 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:51.318 #20 NEW cov: 12412 ft: 14172 corp: 14/159b lim: 35 exec/s: 0 rss: 73Mb L: 19/20 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000."- 00:07:51.318 [2024-11-17 08:20:04.406592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.406618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.318 #21 NEW cov: 12412 ft: 14194 corp: 15/169b lim: 35 exec/s: 0 rss: 73Mb L: 10/20 MS: 1 ShuffleBytes- 00:07:51.318 [2024-11-17 08:20:04.446820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.446846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.318 [2024-11-17 08:20:04.446903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.318 [2024-11-17 08:20:04.446916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.577 #22 NEW cov: 12412 ft: 14202 corp: 16/184b lim: 35 exec/s: 22 rss: 73Mb L: 15/20 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000."- 00:07:51.577 [2024-11-17 08:20:04.486978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.487006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.577 [2024-11-17 08:20:04.487079] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.487093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.577 #23 NEW cov: 12412 ft: 14240 corp: 17/201b lim: 35 exec/s: 23 rss: 73Mb L: 17/20 MS: 1 
EraseBytes- 00:07:51.577 [2024-11-17 08:20:04.547382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.547408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.577 [2024-11-17 08:20:04.547467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.547482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.577 [2024-11-17 08:20:04.547540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000012e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.547554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.577 [2024-11-17 08:20:04.547611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.547624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.577 #24 NEW cov: 12412 ft: 14796 corp: 18/229b lim: 35 exec/s: 24 rss: 73Mb L: 28/28 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:51.577 [2024-11-17 08:20:04.587283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.587308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.577 #25 NEW cov: 12412 ft: 14840 corp: 19/248b lim: 35 exec/s: 25 rss: 73Mb L: 19/28 MS: 1 InsertRepeatedBytes- 00:07:51.577 [2024-11-17 08:20:04.647398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.647423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.577 [2024-11-17 08:20:04.647482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.647496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.577 #26 NEW cov: 12412 ft: 14854 corp: 20/265b lim: 35 exec/s: 26 rss: 73Mb L: 17/28 MS: 1 CrossOver- 00:07:51.577 [2024-11-17 08:20:04.687362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.577 [2024-11-17 08:20:04.687387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.577 #27 NEW cov: 12412 ft: 14882 corp: 21/275b lim: 35 exec/s: 27 rss: 73Mb L: 10/28 MS: 1 EraseBytes- 00:07:51.835 [2024-11-17 08:20:04.727480] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.835 [2024-11-17 08:20:04.727505] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.835 #28 NEW cov: 12412 ft: 14924 corp: 22/285b lim: 35 exec/s: 28 rss: 73Mb L: 10/28 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:51.836 [2024-11-17 08:20:04.767594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.767619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.836 #29 NEW cov: 12412 ft: 14931 corp: 23/297b lim: 35 exec/s: 29 rss: 73Mb L: 12/28 MS: 1 CopyPart- 00:07:51.836 [2024-11-17 08:20:04.827753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.827778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.836 #30 NEW cov: 12412 ft: 14971 corp: 24/308b lim: 35 exec/s: 30 rss: 73Mb L: 11/28 MS: 1 ShuffleBytes- 00:07:51.836 [2024-11-17 08:20:04.888091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.888116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.836 [2024-11-17 08:20:04.888189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000002e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.888204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.836 #31 NEW cov: 12412 ft: 14991 corp: 25/328b lim: 35 exec/s: 31 rss: 73Mb L: 20/28 MS: 1 ChangeBinInt- 00:07:51.836 [2024-11-17 08:20:04.928444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.928468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.836 [2024-11-17 08:20:04.928540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000137 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.928554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.836 [2024-11-17 08:20:04.928609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.928623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.836 [2024-11-17 08:20:04.928676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.836 [2024-11-17 08:20:04.928698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.836 #32 NEW cov: 12412 ft: 14998 corp: 26/359b lim: 35 exec/s: 32 rss: 73Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:52.095 [2024-11-17 08:20:04.988279] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.095 [2024-11-17 08:20:04.988303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.095 #33 NEW cov: 12412 ft: 15069 corp: 27/369b lim: 35 exec/s: 33 rss: 73Mb L: 10/31 MS: 1 ChangeBit- 00:07:52.095 [2024-11-17 08:20:05.048524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.095 [2024-11-17 08:20:05.048549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.095 [2024-11-17 08:20:05.048607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000002e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.095 [2024-11-17 08:20:05.048621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.095 #34 NEW cov: 12412 ft: 15075 corp: 28/389b lim: 35 exec/s: 34 rss: 73Mb L: 20/31 MS: 1 ChangeBinInt- 00:07:52.095 #35 NEW cov: 12412 ft: 15082 corp: 29/399b lim: 35 exec/s: 35 rss: 73Mb L: 10/31 MS: 1 ChangeBit- 00:07:52.095 [2024-11-17 08:20:05.148677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.095 [2024-11-17 08:20:05.148708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.095 #36 NEW cov: 12412 ft: 15161 corp: 30/407b lim: 35 exec/s: 36 rss: 73Mb L: 8/31 MS: 1 EraseBytes- 00:07:52.095 #37 NEW cov: 12412 ft: 15167 corp: 31/415b lim: 35 exec/s: 37 rss: 73Mb L: 8/31 MS: 1 InsertByte- 00:07:52.095 [2024-11-17 08:20:05.228910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.095 [2024-11-17 08:20:05.228935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.355 #38 NEW cov: 12412 ft: 15172 corp: 32/425b lim: 35 exec/s: 38 rss: 73Mb L: 10/31 MS: 1 EraseBytes- 00:07:52.355 [2024-11-17 08:20:05.269225] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.269250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.269322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.269336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.269392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.269405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.355 #39 NEW cov: 12412 ft: 15332 corp: 33/449b lim: 35 exec/s: 39 rss: 73Mb L: 24/31 MS: 1 InsertRepeatedBytes- 00:07:52.355 [2024-11-17 08:20:05.329658] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.329683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.329739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000137 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.329756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.329810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.329823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.329880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.329893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.329946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.329958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.355 #40 NEW cov: 12412 ft: 15382 corp: 34/484b lim: 35 exec/s: 40 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:52.355 [2024-11-17 08:20:05.389482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.389506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.355 #41 NEW cov: 12412 ft: 15458 corp: 35/502b lim: 35 exec/s: 41 rss: 74Mb L: 18/35 MS: 1 CopyPart- 00:07:52.355 [2024-11-17 08:20:05.449604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.449629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.355 [2024-11-17 08:20:05.449704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.355 [2024-11-17 08:20:05.449719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.355 #42 NEW cov: 12412 ft: 15459 corp: 36/521b lim: 35 exec/s: 21 rss: 74Mb L: 19/35 MS: 1 CrossOver- 00:07:52.355 #42 DONE cov: 12412 ft: 15459 corp: 36/521b lim: 35 exec/s: 21 rss: 74Mb 00:07:52.355 ###### Recommended dictionary. ###### 00:07:52.355 "\000\000\000\000\000\000\000." # Uses: 3 00:07:52.355 "\000\000\000\000\000\000\000\000" # Uses: 1 00:07:52.355 ###### End of recommended dictionary. 
###### 00:07:52.355 Done 42 runs in 2 second(s) 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:52.615 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:52.616 08:20:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:07:52.616 [2024-11-17 08:20:05.651037] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:52.616 [2024-11-17 08:20:05.651109] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid996478 ] 00:07:52.875 [2024-11-17 08:20:05.832923] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.875 [2024-11-17 08:20:05.854613] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.875 [2024-11-17 08:20:05.907069] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.875 [2024-11-17 08:20:05.923391] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:52.875 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.875 INFO: Seed: 1376787384 00:07:52.875 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:52.875 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:52.875 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:52.875 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.875 #2 INITED exec/s: 0 rss: 65Mb 00:07:52.875 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.875 This may also happen if the target rejected all inputs we tried so far 00:07:52.875 [2024-11-17 08:20:05.999712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.875 [2024-11-17 08:20:05.999749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.875 [2024-11-17 08:20:05.999863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.875 [2024-11-17 08:20:05.999886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.875 [2024-11-17 08:20:06.000011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.875 [2024-11-17 08:20:06.000032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.394 NEW_FUNC[1/715]: 0x470188 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:53.394 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.394 #21 NEW cov: 12274 ft: 12276 corp: 2/84b lim: 105 exec/s: 0 rss: 72Mb L: 83/83 MS: 4 InsertByte-ChangeBinInt-CrossOver-InsertRepeatedBytes- 00:07:53.394 [2024-11-17 08:20:06.340958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.340999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.341136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 
nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.341161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.341308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.341337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.394 #32 NEW cov: 12388 ft: 12960 corp: 3/167b lim: 105 exec/s: 0 rss: 72Mb L: 83/83 MS: 1 ChangeBinInt- 00:07:53.394 [2024-11-17 08:20:06.410917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.410953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.411057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.411081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.411205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.411231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.394 #33 NEW cov: 12394 ft: 13239 corp: 4/250b lim: 105 exec/s: 0 rss: 72Mb L: 83/83 MS: 1 ShuffleBytes- 00:07:53.394 [2024-11-17 08:20:06.481378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.481412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.481511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.481537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.481681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.481704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.481835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.481861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.394 #34 NEW cov: 12479 ft: 14029 corp: 5/339b lim: 105 exec/s: 0 rss: 
72Mb L: 89/89 MS: 1 CrossOver- 00:07:53.394 [2024-11-17 08:20:06.531357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.531390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.531521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.531544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.394 [2024-11-17 08:20:06.531687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.394 [2024-11-17 08:20:06.531718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.655 #35 NEW cov: 12479 ft: 14175 corp: 6/405b lim: 105 exec/s: 0 rss: 72Mb L: 66/89 MS: 1 EraseBytes- 00:07:53.655 [2024-11-17 08:20:06.601460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.601501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.601622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.601650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.601792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.601819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.655 #36 NEW cov: 12479 ft: 14262 corp: 7/487b lim: 105 exec/s: 0 rss: 72Mb L: 82/89 MS: 1 EraseBytes- 00:07:53.655 [2024-11-17 08:20:06.651929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.651965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.652071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.652096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.652222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.652248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.652375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24926 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.652399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.655 #37 NEW cov: 12479 ft: 14392 corp: 8/585b lim: 105 exec/s: 0 rss: 72Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:07:53.655 [2024-11-17 08:20:06.702079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.702113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.702200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.702226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.702357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:3856 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.702386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.702518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996763912310543 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.702545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.655 #38 NEW cov: 12479 ft: 14432 corp: 9/683b lim: 105 exec/s: 0 rss: 72Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:07:53.655 [2024-11-17 08:20:06.752025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.752064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.752169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.752194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.655 [2024-11-17 08:20:06.752324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.655 [2024-11-17 08:20:06.752349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.655 #39 NEW cov: 12479 ft: 14518 corp: 10/766b lim: 105 exec/s: 0 rss: 72Mb L: 83/98 MS: 1 ShuffleBytes- 00:07:53.915 [2024-11-17 08:20:06.802190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.802229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.802341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.802366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.802508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.802537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.915 #40 NEW cov: 12479 ft: 14560 corp: 11/849b lim: 105 exec/s: 0 rss: 72Mb L: 83/98 MS: 1 ShuffleBytes- 00:07:53.915 [2024-11-17 08:20:06.872673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.872710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.872849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.872876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.873008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.873033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.873174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.873203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.915 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:53.915 #41 NEW cov: 12502 ft: 14601 corp: 12/949b lim: 105 exec/s: 0 rss: 73Mb L: 100/100 MS: 1 CrossOver- 00:07:53.915 [2024-11-17 08:20:06.932792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.932827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.932932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.932952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.933079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.933104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.933231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.933256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.915 #42 NEW cov: 12502 ft: 14625 corp: 13/1040b lim: 105 exec/s: 0 rss: 73Mb L: 91/100 MS: 1 CrossOver- 00:07:53.915 [2024-11-17 08:20:06.982704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.982739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.982860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.982884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:06.983022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1633771776 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:06.983050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.915 #43 NEW cov: 12502 ft: 14699 corp: 14/1115b lim: 105 exec/s: 43 rss: 73Mb L: 75/100 MS: 1 InsertRepeatedBytes- 00:07:53.915 [2024-11-17 08:20:07.052803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:07.052837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.915 [2024-11-17 08:20:07.052978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.915 [2024-11-17 08:20:07.053004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.244 #44 NEW cov: 12502 ft: 15045 corp: 15/1160b lim: 105 exec/s: 44 rss: 73Mb L: 45/100 MS: 1 EraseBytes- 00:07:54.244 [2024-11-17 08:20:07.123154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.123193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.123325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.123352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.123506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.123526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.244 #45 NEW cov: 12502 ft: 15054 corp: 16/1243b lim: 105 exec/s: 45 rss: 73Mb L: 83/100 MS: 1 CopyPart- 00:07:54.244 [2024-11-17 08:20:07.193579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.193610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.193716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.193744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.193877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.193905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.194041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.194068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.244 #46 NEW cov: 12502 ft: 15124 corp: 17/1343b lim: 105 exec/s: 46 rss: 73Mb L: 100/100 MS: 1 ChangeBinInt- 00:07:54.244 [2024-11-17 08:20:07.263826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.263859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.263973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.264002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.264138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:3856 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.264163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.264291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 
lba:7016996763660652303 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.264318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.244 #47 NEW cov: 12502 ft: 15139 corp: 18/1441b lim: 105 exec/s: 47 rss: 73Mb L: 98/100 MS: 1 CMP- DE: "\014\000\000\000"- 00:07:54.244 [2024-11-17 08:20:07.333720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.333747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.244 [2024-11-17 08:20:07.333904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.244 [2024-11-17 08:20:07.333930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.245 #48 NEW cov: 12502 ft: 15176 corp: 19/1494b lim: 105 exec/s: 48 rss: 73Mb L: 53/100 MS: 1 EraseBytes- 00:07:54.556 [2024-11-17 08:20:07.404305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.404342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.404455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.404484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.404599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.404629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.404733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.404760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.556 #49 NEW cov: 12502 ft: 15217 corp: 20/1598b lim: 105 exec/s: 49 rss: 73Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:07:54.556 [2024-11-17 08:20:07.454149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.454186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.454283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.454310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.454446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1633771776 len:50 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.454468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.556 #50 NEW cov: 12502 ft: 15300 corp: 21/1674b lim: 105 exec/s: 50 rss: 73Mb L: 76/104 MS: 1 InsertByte- 00:07:54.556 [2024-11-17 08:20:07.524650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.524684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.524773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.524800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.524923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.524953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.556 [2024-11-17 08:20:07.525092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.556 [2024-11-17 08:20:07.525115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.556 #51 NEW cov: 12502 ft: 15323 corp: 22/1775b lim: 105 exec/s: 51 rss: 73Mb L: 101/104 MS: 1 CopyPart- 00:07:54.557 [2024-11-17 08:20:07.574846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.574878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.557 [2024-11-17 08:20:07.574970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.574992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.557 [2024-11-17 08:20:07.575112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.575136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.557 [2024-11-17 08:20:07.575274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.575297] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.557 #52 NEW cov: 12502 ft: 15339 corp: 23/1865b lim: 105 exec/s: 52 rss: 73Mb L: 90/104 MS: 1 CopyPart- 00:07:54.557 [2024-11-17 08:20:07.624981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:3073 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.625012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.557 [2024-11-17 08:20:07.625094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.625117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.557 [2024-11-17 08:20:07.625244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.625269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.557 [2024-11-17 08:20:07.625410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.557 [2024-11-17 08:20:07.625437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.557 #53 NEW cov: 12502 ft: 15409 corp: 24/1966b lim: 105 exec/s: 53 rss: 73Mb L: 101/104 MS: 1 PersAutoDict- DE: "\014\000\000\000"- 00:07:54.816 [2024-11-17 08:20:07.694918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.694956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.816 [2024-11-17 08:20:07.695075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7017841190223569249 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.695102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.816 #54 NEW cov: 12502 ft: 15424 corp: 25/2011b lim: 105 exec/s: 54 rss: 74Mb L: 45/104 MS: 1 ChangeBinInt- 00:07:54.816 [2024-11-17 08:20:07.765500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:25088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.765538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.816 [2024-11-17 08:20:07.765641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996767954632703 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.765665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.816 [2024-11-17 08:20:07.765807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.765831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.816 [2024-11-17 08:20:07.765962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.765987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.816 #55 NEW cov: 12502 ft: 15427 corp: 26/2114b lim: 105 exec/s: 55 rss: 74Mb L: 103/104 MS: 1 InsertRepeatedBytes- 00:07:54.816 [2024-11-17 08:20:07.815313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7016996767670100321 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.815351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.816 [2024-11-17 08:20:07.815484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.816 [2024-11-17 08:20:07.815512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.816 #56 NEW cov: 12502 ft: 15458 corp: 27/2174b lim: 105 exec/s: 56 rss: 74Mb L: 60/104 MS: 1 EraseBytes- 00:07:54.816 [2024-11-17 08:20:07.865924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:3073 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.865957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.817 [2024-11-17 08:20:07.866046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.866071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.817 [2024-11-17 08:20:07.866203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.866230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.817 [2024-11-17 08:20:07.866364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.866388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.817 #57 NEW cov: 12502 ft: 15471 corp: 28/2278b lim: 105 exec/s: 57 rss: 74Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:07:54.817 [2024-11-17 08:20:07.935963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5980887378398175585 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.936005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.817 [2024-11-17 08:20:07.936135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.936162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.817 [2024-11-17 08:20:07.936290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7016996765293437281 len:24930 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.817 [2024-11-17 08:20:07.936316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.076 #58 NEW cov: 12502 ft: 15527 corp: 29/2361b lim: 105 exec/s: 29 rss: 74Mb L: 83/104 MS: 1 ChangeBit- 00:07:55.076 #58 DONE cov: 12502 ft: 15527 corp: 29/2361b lim: 105 exec/s: 29 rss: 74Mb 00:07:55.076 ###### Recommended dictionary. ###### 00:07:55.076 "\014\000\000\000" # Uses: 1 00:07:55.076 ###### End of recommended dictionary. ###### 00:07:55.076 Done 58 runs in 2 second(s) 00:07:55.076 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:07:55.076 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:55.076 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:55.077 08:20:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:07:55.077 [2024-11-17 08:20:08.139074] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:07:55.077 [2024-11-17 08:20:08.139161] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997014 ] 00:07:55.337 [2024-11-17 08:20:08.320953] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.337 [2024-11-17 08:20:08.342769] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.337 [2024-11-17 08:20:08.395028] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.337 [2024-11-17 08:20:08.411310] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:55.337 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.337 INFO: Seed: 3865778564 00:07:55.337 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:55.337 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:55.337 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:55.337 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.337 #2 INITED exec/s: 0 rss: 65Mb 00:07:55.337 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.337 This may also happen if the target rejected all inputs we tried so far 00:07:55.337 [2024-11-17 08:20:08.466795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.337 [2024-11-17 08:20:08.466826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.337 [2024-11-17 08:20:08.466876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.337 [2024-11-17 08:20:08.466892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.855 NEW_FUNC[1/716]: 0x473508 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:55.855 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.855 #12 NEW cov: 12296 ft: 12295 corp: 2/65b lim: 120 exec/s: 0 rss: 72Mb L: 64/64 MS: 5 ChangeBit-ShuffleBytes-CopyPart-ChangeBit-InsertRepeatedBytes- 00:07:55.855 [2024-11-17 08:20:08.797564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.855 [2024-11-17 08:20:08.797597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.856 [2024-11-17 08:20:08.797651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.797668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.856 #13 NEW cov: 12409 ft: 12735 corp: 3/129b lim: 120 exec/s: 0 rss: 72Mb L: 64/64 MS: 1 ShuffleBytes- 00:07:55.856 [2024-11-17 08:20:08.857707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.857735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.856 [2024-11-17 08:20:08.857773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.857789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.856 #14 NEW cov: 12415 ft: 12954 corp: 4/193b lim: 120 exec/s: 0 rss: 72Mb L: 64/64 MS: 1 ShuffleBytes- 00:07:55.856 [2024-11-17 08:20:08.898080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.898109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.856 [2024-11-17 08:20:08.898150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.898169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.856 [2024-11-17 08:20:08.898218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2242545357980376863 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.898234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.856 [2024-11-17 08:20:08.898284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.898300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.856 #15 NEW cov: 12500 ft: 13853 corp: 5/301b lim: 120 exec/s: 0 rss: 72Mb L: 108/108 MS: 1 InsertRepeatedBytes- 00:07:55.856 [2024-11-17 08:20:08.957955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.957982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.856 [2024-11-17 08:20:08.958039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.856 [2024-11-17 08:20:08.958054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.856 #16 NEW cov: 12500 ft: 13924 corp: 6/365b lim: 120 exec/s: 0 rss: 72Mb L: 64/108 MS: 1 ShuffleBytes- 
00:07:56.115 [2024-11-17 08:20:08.998060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.115 [2024-11-17 08:20:08.998087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.115 [2024-11-17 08:20:08.998125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.115 [2024-11-17 08:20:08.998141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.115 #17 NEW cov: 12500 ft: 14024 corp: 7/429b lim: 120 exec/s: 0 rss: 72Mb L: 64/108 MS: 1 ChangeBit- 00:07:56.116 [2024-11-17 08:20:09.038167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.038194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.116 [2024-11-17 08:20:09.038260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:4096 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.038277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.116 #18 NEW cov: 12500 ft: 14090 corp: 8/493b lim: 120 exec/s: 0 rss: 72Mb L: 64/108 MS: 1 ChangeBit- 00:07:56.116 [2024-11-17 08:20:09.078292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.078319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.116 [2024-11-17 08:20:09.078375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.078391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.116 #19 NEW cov: 12500 ft: 14183 corp: 9/544b lim: 120 exec/s: 0 rss: 72Mb L: 51/108 MS: 1 EraseBytes- 00:07:56.116 [2024-11-17 08:20:09.138469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:8193 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.138495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.116 [2024-11-17 08:20:09.138548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.138564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.116 #20 NEW cov: 12500 ft: 14203 corp: 10/608b lim: 120 exec/s: 0 rss: 72Mb L: 64/108 MS: 1 ChangeBit- 00:07:56.116 [2024-11-17 08:20:09.198772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.198799] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.116 [2024-11-17 08:20:09.198862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.198878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.116 [2024-11-17 08:20:09.198927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.116 [2024-11-17 08:20:09.198942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.116 #21 NEW cov: 12500 ft: 14549 corp: 11/698b lim: 120 exec/s: 0 rss: 73Mb L: 90/108 MS: 1 InsertRepeatedBytes- 00:07:56.375 [2024-11-17 08:20:09.258814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.258840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.375 [2024-11-17 08:20:09.258878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3801088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.258894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.375 #22 NEW cov: 12500 ft: 14577 corp: 12/763b lim: 120 exec/s: 0 rss: 73Mb L: 65/108 MS: 1 InsertByte- 00:07:56.375 [2024-11-17 08:20:09.298919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.298946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.375 [2024-11-17 08:20:09.298984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.298999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.375 #23 NEW cov: 12500 ft: 14616 corp: 13/827b lim: 120 exec/s: 0 rss: 73Mb L: 64/108 MS: 1 ChangeByte- 00:07:56.375 [2024-11-17 08:20:09.338999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:8193 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.339025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.375 [2024-11-17 08:20:09.339062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:176 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.339077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.375 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:56.375 #24 NEW cov: 12523 ft: 14723 corp: 14/891b lim: 120 exec/s: 0 rss: 73Mb L: 
64/108 MS: 1 CMP- DE: "\257\020\317\253\221\226\212\000"- 00:07:56.375 [2024-11-17 08:20:09.399192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.399218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.375 [2024-11-17 08:20:09.399262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.399277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.375 #25 NEW cov: 12523 ft: 14747 corp: 15/942b lim: 120 exec/s: 0 rss: 73Mb L: 51/108 MS: 1 CrossOver- 00:07:56.375 [2024-11-17 08:20:09.439342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.439371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.375 [2024-11-17 08:20:09.439423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:53876069761024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.439439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.375 #26 NEW cov: 12523 ft: 14803 corp: 16/1006b lim: 120 exec/s: 26 rss: 73Mb L: 64/108 MS: 1 ChangeByte- 00:07:56.375 [2024-11-17 08:20:09.499466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.499493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.375 [2024-11-17 08:20:09.499557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.375 [2024-11-17 08:20:09.499574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.635 #27 NEW cov: 12523 ft: 14813 corp: 17/1070b lim: 120 exec/s: 27 rss: 73Mb L: 64/108 MS: 1 CrossOver- 00:07:56.635 [2024-11-17 08:20:09.539605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.635 [2024-11-17 08:20:09.539632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.635 [2024-11-17 08:20:09.539701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.635 [2024-11-17 08:20:09.539718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.635 #28 NEW cov: 12523 ft: 14829 corp: 18/1134b lim: 120 exec/s: 28 rss: 73Mb L: 64/108 MS: 1 ChangeByte- 00:07:56.635 [2024-11-17 08:20:09.579722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:56.635 [2024-11-17 08:20:09.579747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.635 [2024-11-17 08:20:09.579801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:53876069761024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.635 [2024-11-17 08:20:09.579817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.635 #29 NEW cov: 12523 ft: 14867 corp: 19/1198b lim: 120 exec/s: 29 rss: 73Mb L: 64/108 MS: 1 ChangeByte- 00:07:56.635 [2024-11-17 08:20:09.639882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.635 [2024-11-17 08:20:09.639908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.636 [2024-11-17 08:20:09.639944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3801088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.636 [2024-11-17 08:20:09.639959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.636 #30 NEW cov: 12523 ft: 14929 corp: 20/1263b lim: 120 exec/s: 30 rss: 73Mb L: 65/108 MS: 1 ShuffleBytes- 00:07:56.636 [2024-11-17 08:20:09.699909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.636 [2024-11-17 08:20:09.699936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.636 #31 NEW cov: 12523 ft: 15720 corp: 21/1306b lim: 120 exec/s: 31 rss: 73Mb L: 43/108 MS: 1 EraseBytes- 00:07:56.636 [2024-11-17 08:20:09.740130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.636 [2024-11-17 08:20:09.740156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.636 [2024-11-17 08:20:09.740193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3801088 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.636 [2024-11-17 08:20:09.740208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.895 #32 NEW cov: 12523 ft: 15730 corp: 22/1371b lim: 120 exec/s: 32 rss: 73Mb L: 65/108 MS: 1 ChangeByte- 00:07:56.895 [2024-11-17 08:20:09.800157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.895 [2024-11-17 08:20:09.800184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.896 #33 NEW cov: 12523 ft: 15802 corp: 23/1408b lim: 120 exec/s: 33 rss: 73Mb L: 37/108 MS: 1 EraseBytes- 00:07:56.896 [2024-11-17 08:20:09.840434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:09.840462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.896 [2024-11-17 08:20:09.840528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:144115188075855872 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:09.840544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.896 #39 NEW cov: 12523 ft: 15818 corp: 24/1472b lim: 120 exec/s: 39 rss: 73Mb L: 64/108 MS: 1 ChangeBit- 00:07:56.896 [2024-11-17 08:20:09.880427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:09.880453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.896 #40 NEW cov: 12523 ft: 15825 corp: 25/1509b lim: 120 exec/s: 40 rss: 73Mb L: 37/108 MS: 1 ChangeBit- 00:07:56.896 [2024-11-17 08:20:09.940882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:09.940909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.896 [2024-11-17 08:20:09.940954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:09.940976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.896 [2024-11-17 08:20:09.941027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12876550762191303346 len:45747 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:09.941043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.896 #41 NEW cov: 12523 ft: 15867 corp: 26/1590b lim: 120 exec/s: 41 rss: 73Mb L: 81/108 MS: 1 InsertRepeatedBytes- 00:07:56.896 [2024-11-17 08:20:10.000922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:10.000949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.896 [2024-11-17 08:20:10.000989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.896 [2024-11-17 08:20:10.001006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.896 #42 NEW cov: 12523 ft: 15887 corp: 27/1654b lim: 120 exec/s: 42 rss: 73Mb L: 64/108 MS: 1 ChangeBit- 00:07:57.155 [2024-11-17 08:20:10.041072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:593678565376 len:64620 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.155 [2024-11-17 08:20:10.041101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.155 [2024-11-17 08:20:10.041140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 
nsid:0 lba:144115188075855872 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.155 [2024-11-17 08:20:10.041156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.155 #43 NEW cov: 12523 ft: 15932 corp: 28/1718b lim: 120 exec/s: 43 rss: 74Mb L: 64/108 MS: 1 CMP- DE: "\000\212\226\226\374k\222\316"- 00:07:57.155 [2024-11-17 08:20:10.101230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.101257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.156 [2024-11-17 08:20:10.101294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3211264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.101310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.156 #44 NEW cov: 12523 ft: 15987 corp: 29/1777b lim: 120 exec/s: 44 rss: 74Mb L: 59/108 MS: 1 EraseBytes- 00:07:57.156 [2024-11-17 08:20:10.141358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.141384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.156 [2024-11-17 08:20:10.141422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:53876069761024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.141435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.156 #45 NEW cov: 12523 ft: 16021 corp: 30/1842b lim: 120 exec/s: 45 rss: 74Mb L: 65/108 MS: 1 InsertByte- 00:07:57.156 [2024-11-17 08:20:10.201466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.201492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.156 [2024-11-17 08:20:10.201560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.201575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.156 #46 NEW cov: 12523 ft: 16047 corp: 31/1907b lim: 120 exec/s: 46 rss: 74Mb L: 65/108 MS: 1 InsertByte- 00:07:57.156 [2024-11-17 08:20:10.241882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.241908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.156 [2024-11-17 08:20:10.241949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.241964] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.156 [2024-11-17 08:20:10.242015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.242031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.156 [2024-11-17 08:20:10.242085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.156 [2024-11-17 08:20:10.242100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.156 #47 NEW cov: 12523 ft: 16058 corp: 32/2014b lim: 120 exec/s: 47 rss: 74Mb L: 107/108 MS: 1 InsertRepeatedBytes- 00:07:57.416 [2024-11-17 08:20:10.301927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.301954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.416 [2024-11-17 08:20:10.301989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.302005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.416 [2024-11-17 08:20:10.302059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.302074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.416 #48 NEW cov: 12523 ft: 16062 corp: 33/2104b lim: 120 exec/s: 48 rss: 74Mb L: 90/108 MS: 1 ShuffleBytes- 00:07:57.416 [2024-11-17 08:20:10.341892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.341919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.416 [2024-11-17 08:20:10.341972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.341988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.416 #49 NEW cov: 12523 ft: 16077 corp: 34/2167b lim: 120 exec/s: 49 rss: 74Mb L: 63/108 MS: 1 CrossOver- 00:07:57.416 [2024-11-17 08:20:10.402007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.402034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.416 [2024-11-17 08:20:10.402107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:144115188075855872 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:57.416 [2024-11-17 08:20:10.402123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.416 #50 NEW cov: 12523 ft: 16085 corp: 35/2231b lim: 120 exec/s: 50 rss: 74Mb L: 64/108 MS: 1 ChangeByte- 00:07:57.416 [2024-11-17 08:20:10.442249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:10923366096979361792 len:38808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.442275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.416 [2024-11-17 08:20:10.442314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:10923199423753721751 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.442329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.416 [2024-11-17 08:20:10.442380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.416 [2024-11-17 08:20:10.442394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.416 #51 NEW cov: 12523 ft: 16095 corp: 36/2308b lim: 120 exec/s: 25 rss: 74Mb L: 77/108 MS: 1 InsertRepeatedBytes- 00:07:57.416 #51 DONE cov: 12523 ft: 16095 corp: 36/2308b lim: 120 exec/s: 25 rss: 74Mb 00:07:57.416 ###### Recommended dictionary. ###### 00:07:57.416 "\257\020\317\253\221\226\212\000" # Uses: 1 00:07:57.416 "\000\212\226\226\374k\222\316" # Uses: 0 00:07:57.416 ###### End of recommended dictionary. 
###### 00:07:57.416 Done 51 runs in 2 second(s) 00:07:57.675 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:07:57.675 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:57.676 08:20:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:07:57.676 [2024-11-17 08:20:10.617938] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:07:57.676 [2024-11-17 08:20:10.618020] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997374 ] 00:07:57.676 [2024-11-17 08:20:10.797936] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.935 [2024-11-17 08:20:10.820192] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.935 [2024-11-17 08:20:10.872637] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.935 [2024-11-17 08:20:10.888969] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:57.935 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.935 INFO: Seed: 2047829306 00:07:57.935 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:07:57.935 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:07:57.935 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:57.935 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.935 #2 INITED exec/s: 0 rss: 65Mb 00:07:57.935 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:57.935 This may also happen if the target rejected all inputs we tried so far 00:07:57.935 [2024-11-17 08:20:10.955239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:57.935 [2024-11-17 08:20:10.955275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.935 [2024-11-17 08:20:10.955396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:57.935 [2024-11-17 08:20:10.955420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.935 [2024-11-17 08:20:10.955538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:57.935 [2024-11-17 08:20:10.955562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.195 NEW_FUNC[1/714]: 0x476df8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:58.195 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.195 #24 NEW cov: 12239 ft: 12240 corp: 2/62b lim: 100 exec/s: 0 rss: 72Mb L: 61/61 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:58.195 [2024-11-17 08:20:11.286064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.195 [2024-11-17 08:20:11.286113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.195 [2024-11-17 08:20:11.286250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.195 [2024-11-17 08:20:11.286278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.195 [2024-11-17 08:20:11.286403] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.195 [2024-11-17 08:20:11.286428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.195 #25 NEW cov: 12352 ft: 12950 corp: 3/123b lim: 100 exec/s: 0 rss: 72Mb L: 61/61 MS: 1 ChangeBinInt- 00:07:58.454 [2024-11-17 08:20:11.356295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.454 [2024-11-17 08:20:11.356337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.454 [2024-11-17 08:20:11.356457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.455 [2024-11-17 08:20:11.356484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.356607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.455 [2024-11-17 08:20:11.356630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.455 #26 NEW cov: 12358 ft: 13182 corp: 4/185b lim: 100 exec/s: 0 rss: 72Mb L: 62/62 MS: 1 InsertByte- 00:07:58.455 [2024-11-17 08:20:11.426400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.455 [2024-11-17 08:20:11.426437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.426549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.455 [2024-11-17 08:20:11.426567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.426684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.455 [2024-11-17 08:20:11.426714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.455 #27 NEW cov: 12443 ft: 13409 corp: 5/247b lim: 100 exec/s: 0 rss: 72Mb L: 62/62 MS: 1 ChangeBit- 00:07:58.455 [2024-11-17 08:20:11.496663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.455 [2024-11-17 08:20:11.496702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.496818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.455 [2024-11-17 08:20:11.496842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.496963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.455 [2024-11-17 08:20:11.496987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.455 #28 NEW cov: 12443 ft: 13499 corp: 6/309b lim: 100 exec/s: 0 rss: 72Mb L: 62/62 MS: 1 ChangeBinInt- 00:07:58.455 
[2024-11-17 08:20:11.566914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.455 [2024-11-17 08:20:11.566945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.567058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.455 [2024-11-17 08:20:11.567078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.455 [2024-11-17 08:20:11.567196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.455 [2024-11-17 08:20:11.567217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.455 #29 NEW cov: 12443 ft: 13587 corp: 7/370b lim: 100 exec/s: 0 rss: 72Mb L: 61/62 MS: 1 ChangeBinInt- 00:07:58.715 [2024-11-17 08:20:11.616979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.715 [2024-11-17 08:20:11.617015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.617116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.715 [2024-11-17 08:20:11.617136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.617249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.715 [2024-11-17 08:20:11.617272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.715 #30 NEW cov: 12443 ft: 13680 corp: 8/431b lim: 100 exec/s: 0 rss: 72Mb L: 61/62 MS: 1 ChangeBit- 00:07:58.715 [2024-11-17 08:20:11.667186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.715 [2024-11-17 08:20:11.667222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.667333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.715 [2024-11-17 08:20:11.667354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.667465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.715 [2024-11-17 08:20:11.667491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.715 #31 NEW cov: 12443 ft: 13725 corp: 9/493b lim: 100 exec/s: 0 rss: 73Mb L: 62/62 MS: 1 ChangeBit- 00:07:58.715 [2024-11-17 08:20:11.737205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.715 [2024-11-17 08:20:11.737237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.737349] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.715 [2024-11-17 08:20:11.737369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.715 #32 NEW cov: 12443 ft: 14085 corp: 10/536b lim: 100 exec/s: 0 rss: 73Mb L: 43/62 MS: 1 EraseBytes- 00:07:58.715 [2024-11-17 08:20:11.787505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.715 [2024-11-17 08:20:11.787541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.787642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.715 [2024-11-17 08:20:11.787665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.787794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.715 [2024-11-17 08:20:11.787814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.715 #33 NEW cov: 12443 ft: 14145 corp: 11/597b lim: 100 exec/s: 0 rss: 73Mb L: 61/62 MS: 1 CrossOver- 00:07:58.715 [2024-11-17 08:20:11.837725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.715 [2024-11-17 08:20:11.837756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.837855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.715 [2024-11-17 08:20:11.837878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.715 [2024-11-17 08:20:11.837995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.715 [2024-11-17 08:20:11.838035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.975 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:58.975 #34 NEW cov: 12466 ft: 14212 corp: 12/659b lim: 100 exec/s: 0 rss: 73Mb L: 62/62 MS: 1 ShuffleBytes- 00:07:58.975 [2024-11-17 08:20:11.887894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.975 [2024-11-17 08:20:11.887927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:11.888024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.975 [2024-11-17 08:20:11.888048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:11.888160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.975 [2024-11-17 08:20:11.888184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.975 #35 NEW cov: 12466 ft: 
14258 corp: 13/720b lim: 100 exec/s: 0 rss: 73Mb L: 61/62 MS: 1 ChangeBinInt- 00:07:58.975 [2024-11-17 08:20:11.937982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.975 [2024-11-17 08:20:11.938014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:11.938095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.975 [2024-11-17 08:20:11.938116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:11.938227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.975 [2024-11-17 08:20:11.938251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.975 #36 NEW cov: 12466 ft: 14279 corp: 14/783b lim: 100 exec/s: 36 rss: 73Mb L: 63/63 MS: 1 InsertByte- 00:07:58.975 [2024-11-17 08:20:11.987940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.975 [2024-11-17 08:20:11.987972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:11.988081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.975 [2024-11-17 08:20:11.988107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.975 #37 NEW cov: 12466 ft: 14425 corp: 15/826b lim: 100 exec/s: 37 rss: 73Mb L: 43/63 MS: 1 CopyPart- 00:07:58.975 [2024-11-17 08:20:12.058808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.975 [2024-11-17 08:20:12.058842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:12.058951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.975 [2024-11-17 08:20:12.058970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:12.059081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.975 [2024-11-17 08:20:12.059105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:12.059219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:58.975 [2024-11-17 08:20:12.059241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.975 #38 NEW cov: 12466 ft: 14717 corp: 16/924b lim: 100 exec/s: 38 rss: 73Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:07:58.975 [2024-11-17 08:20:12.108606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:58.975 [2024-11-17 08:20:12.108641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:12.108749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:58.975 [2024-11-17 08:20:12.108772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.975 [2024-11-17 08:20:12.108896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:58.975 [2024-11-17 08:20:12.108920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.235 #39 NEW cov: 12466 ft: 14775 corp: 17/986b lim: 100 exec/s: 39 rss: 73Mb L: 62/98 MS: 1 InsertByte- 00:07:59.235 [2024-11-17 08:20:12.179043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.235 [2024-11-17 08:20:12.179075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.179149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.235 [2024-11-17 08:20:12.179175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.179295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.235 [2024-11-17 08:20:12.179313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.179426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.235 [2024-11-17 08:20:12.179448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.235 #40 NEW cov: 12466 ft: 14847 corp: 18/1071b lim: 100 exec/s: 40 rss: 73Mb L: 85/98 MS: 1 InsertRepeatedBytes- 00:07:59.235 [2024-11-17 08:20:12.248977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.235 [2024-11-17 08:20:12.249012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.249117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.235 [2024-11-17 08:20:12.249139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.249258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.235 [2024-11-17 08:20:12.249282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.235 #41 NEW cov: 12466 ft: 14857 corp: 19/1132b lim: 100 exec/s: 41 rss: 73Mb L: 61/98 MS: 1 CrossOver- 00:07:59.235 [2024-11-17 08:20:12.319225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.235 [2024-11-17 08:20:12.319255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.319348] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.235 [2024-11-17 08:20:12.319367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.235 [2024-11-17 08:20:12.319482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.235 [2024-11-17 08:20:12.319505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.235 #42 NEW cov: 12466 ft: 14869 corp: 20/1194b lim: 100 exec/s: 42 rss: 73Mb L: 62/98 MS: 1 ChangeBit- 00:07:59.495 [2024-11-17 08:20:12.389325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.495 [2024-11-17 08:20:12.389363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.389493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.495 [2024-11-17 08:20:12.389518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.389636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.495 [2024-11-17 08:20:12.389656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.495 #43 NEW cov: 12466 ft: 14905 corp: 21/1259b lim: 100 exec/s: 43 rss: 73Mb L: 65/98 MS: 1 CopyPart- 00:07:59.495 [2024-11-17 08:20:12.439498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.495 [2024-11-17 08:20:12.439531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.439653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.495 [2024-11-17 08:20:12.439676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.439803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.495 [2024-11-17 08:20:12.439829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.495 #44 NEW cov: 12466 ft: 14919 corp: 22/1321b lim: 100 exec/s: 44 rss: 73Mb L: 62/98 MS: 1 ChangeByte- 00:07:59.495 [2024-11-17 08:20:12.509733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.495 [2024-11-17 08:20:12.509764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.509851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.495 [2024-11-17 08:20:12.509875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.509993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 
nsid:0 00:07:59.495 [2024-11-17 08:20:12.510016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.495 #45 NEW cov: 12466 ft: 14955 corp: 23/1393b lim: 100 exec/s: 45 rss: 73Mb L: 72/98 MS: 1 InsertRepeatedBytes- 00:07:59.495 [2024-11-17 08:20:12.559884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.495 [2024-11-17 08:20:12.559917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.560014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.495 [2024-11-17 08:20:12.560038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.560146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.495 [2024-11-17 08:20:12.560170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.495 #46 NEW cov: 12466 ft: 14969 corp: 24/1454b lim: 100 exec/s: 46 rss: 73Mb L: 61/98 MS: 1 ShuffleBytes- 00:07:59.495 [2024-11-17 08:20:12.610001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.495 [2024-11-17 08:20:12.610049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.610172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.495 [2024-11-17 08:20:12.610198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.495 [2024-11-17 08:20:12.610314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.495 [2024-11-17 08:20:12.610337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.755 #47 NEW cov: 12466 ft: 15008 corp: 25/1516b lim: 100 exec/s: 47 rss: 73Mb L: 62/98 MS: 1 ChangeBinInt- 00:07:59.755 [2024-11-17 08:20:12.680245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.755 [2024-11-17 08:20:12.680280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.755 [2024-11-17 08:20:12.680376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.755 [2024-11-17 08:20:12.680418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.755 [2024-11-17 08:20:12.680534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.755 [2024-11-17 08:20:12.680555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.755 #48 NEW cov: 12466 ft: 15082 corp: 26/1577b lim: 100 exec/s: 48 rss: 74Mb L: 61/98 MS: 1 ChangeBit- 00:07:59.755 [2024-11-17 08:20:12.750448] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.755 [2024-11-17 08:20:12.750479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.755 [2024-11-17 08:20:12.750583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.755 [2024-11-17 08:20:12.750605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.755 [2024-11-17 08:20:12.750721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.755 [2024-11-17 08:20:12.750743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.755 #49 NEW cov: 12466 ft: 15088 corp: 27/1638b lim: 100 exec/s: 49 rss: 74Mb L: 61/98 MS: 1 ChangeByte- 00:07:59.755 [2024-11-17 08:20:12.800255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.755 [2024-11-17 08:20:12.800283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.755 #50 NEW cov: 12466 ft: 15396 corp: 28/1677b lim: 100 exec/s: 50 rss: 74Mb L: 39/98 MS: 1 EraseBytes- 00:07:59.755 [2024-11-17 08:20:12.870821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.755 [2024-11-17 08:20:12.870850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.755 [2024-11-17 08:20:12.870927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.755 [2024-11-17 08:20:12.870946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.755 [2024-11-17 08:20:12.871058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.755 [2024-11-17 08:20:12.871080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.755 #51 NEW cov: 12466 ft: 15434 corp: 29/1756b lim: 100 exec/s: 51 rss: 74Mb L: 79/98 MS: 1 InsertRepeatedBytes- 00:08:00.015 [2024-11-17 08:20:12.920962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.015 [2024-11-17 08:20:12.920996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.015 [2024-11-17 08:20:12.921105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.015 [2024-11-17 08:20:12.921128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.015 [2024-11-17 08:20:12.921236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.015 [2024-11-17 08:20:12.921260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.015 #52 NEW cov: 12466 ft: 15466 corp: 30/1817b lim: 100 exec/s: 26 rss: 74Mb L: 61/98 MS: 1 ChangeBinInt- 
00:08:00.015 #52 DONE cov: 12466 ft: 15466 corp: 30/1817b lim: 100 exec/s: 26 rss: 74Mb 00:08:00.015 Done 52 runs in 2 second(s) 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.015 08:20:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:00.015 [2024-11-17 08:20:13.112305] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:00.015 [2024-11-17 08:20:13.112383] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid997835 ] 00:08:00.274 [2024-11-17 08:20:13.288031] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.275 [2024-11-17 08:20:13.309765] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.275 [2024-11-17 08:20:13.362113] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.275 [2024-11-17 08:20:13.378377] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:00.275 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.275 INFO: Seed: 242845630 00:08:00.535 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:00.535 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:00.535 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:00.535 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.535 #2 INITED exec/s: 0 rss: 65Mb 00:08:00.535 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.535 This may also happen if the target rejected all inputs we tried so far 00:08:00.535 [2024-11-17 08:20:13.454522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:174483046 len:1 00:08:00.535 [2024-11-17 08:20:13.454557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.535 [2024-11-17 08:20:13.454683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:00.535 [2024-11-17 08:20:13.454708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.794 NEW_FUNC[1/714]: 0x479db8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:00.794 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.794 #4 NEW cov: 12199 ft: 12200 corp: 2/22b lim: 50 exec/s: 0 rss: 72Mb L: 21/21 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:00.794 [2024-11-17 08:20:13.785553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:174483046 len:1 00:08:00.794 [2024-11-17 08:20:13.785616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.794 [2024-11-17 08:20:13.785757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:60160 len:1 00:08:00.794 [2024-11-17 08:20:13.785793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.794 #5 NEW cov: 12329 ft: 12797 corp: 3/43b lim: 50 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 ChangeByte- 00:08:00.794 [2024-11-17 08:20:13.855566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:37999122030421606 len:1 00:08:00.794 
[2024-11-17 08:20:13.855600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.794 [2024-11-17 08:20:13.855720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:235 len:1 00:08:00.794 [2024-11-17 08:20:13.855745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.794 #6 NEW cov: 12335 ft: 13062 corp: 4/65b lim: 50 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertByte- 00:08:00.794 [2024-11-17 08:20:13.926378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:00.794 [2024-11-17 08:20:13.926414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.794 [2024-11-17 08:20:13.926494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:00.795 [2024-11-17 08:20:13.926520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.795 [2024-11-17 08:20:13.926636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:00.795 [2024-11-17 08:20:13.926661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.795 [2024-11-17 08:20:13.926776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:00.795 [2024-11-17 08:20:13.926800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.795 [2024-11-17 08:20:13.926911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65291 00:08:00.795 [2024-11-17 08:20:13.926937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:01.054 #7 NEW cov: 12420 ft: 13676 corp: 5/115b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:01.054 [2024-11-17 08:20:13.965842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:258385407010406 len:1 00:08:01.054 [2024-11-17 08:20:13.965874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.054 [2024-11-17 08:20:13.965989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:258385232527360 len:1 00:08:01.054 [2024-11-17 08:20:13.966009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.054 #8 NEW cov: 12420 ft: 13892 corp: 6/140b lim: 50 exec/s: 0 rss: 72Mb L: 25/50 MS: 1 CopyPart- 00:08:01.054 [2024-11-17 08:20:14.015985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:258385407010406 len:1 00:08:01.054 [2024-11-17 08:20:14.016021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.054 [2024-11-17 08:20:14.016132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:60161 00:08:01.055 [2024-11-17 08:20:14.016154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.055 #9 NEW cov: 12420 ft: 14010 corp: 7/165b lim: 50 exec/s: 0 rss: 72Mb L: 25/50 MS: 1 CopyPart- 00:08:01.055 [2024-11-17 08:20:14.085993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:234881024 len:1 00:08:01.055 [2024-11-17 08:20:14.086026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.055 #13 NEW cov: 12420 ft: 14392 corp: 8/184b lim: 50 exec/s: 0 rss: 72Mb L: 19/50 MS: 4 CopyPart-ChangeBit-EraseBytes-InsertRepeatedBytes- 00:08:01.055 [2024-11-17 08:20:14.136221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743283435569151 len:2571 00:08:01.055 [2024-11-17 08:20:14.136246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.055 #15 NEW cov: 12420 ft: 14481 corp: 9/194b lim: 50 exec/s: 0 rss: 72Mb L: 10/50 MS: 2 CopyPart-CMP- DE: "\377\377\377\377\377\377\377G"- 00:08:01.055 [2024-11-17 08:20:14.187001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:01.055 [2024-11-17 08:20:14.187031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.055 [2024-11-17 08:20:14.187113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.055 [2024-11-17 08:20:14.187135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.055 [2024-11-17 08:20:14.187244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.055 [2024-11-17 08:20:14.187265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.055 [2024-11-17 08:20:14.187381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.055 [2024-11-17 08:20:14.187409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.055 [2024-11-17 08:20:14.187538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65291 00:08:01.055 [2024-11-17 08:20:14.187563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:01.314 #16 NEW cov: 12420 ft: 14569 corp: 10/244b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:01.314 [2024-11-17 08:20:14.256687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:258385407010406 len:3 00:08:01.314 [2024-11-17 
08:20:14.256724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.256817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:258385232527360 len:1 00:08:01.314 [2024-11-17 08:20:14.256845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.314 #17 NEW cov: 12420 ft: 14636 corp: 11/269b lim: 50 exec/s: 0 rss: 72Mb L: 25/50 MS: 1 ChangeBit- 00:08:01.314 [2024-11-17 08:20:14.306825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:37999122030421606 len:1 00:08:01.314 [2024-11-17 08:20:14.306855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.306941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:235 len:129 00:08:01.314 [2024-11-17 08:20:14.306966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.314 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:01.314 #18 NEW cov: 12443 ft: 14669 corp: 12/291b lim: 50 exec/s: 0 rss: 73Mb L: 22/50 MS: 1 ChangeBit- 00:08:01.314 [2024-11-17 08:20:14.377237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:258385407010406 len:1 00:08:01.314 [2024-11-17 08:20:14.377270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.377355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:60160 len:1 00:08:01.314 [2024-11-17 08:20:14.377377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.377493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15400960 len:1 00:08:01.314 [2024-11-17 08:20:14.377516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.314 #19 NEW cov: 12443 ft: 14912 corp: 13/321b lim: 50 exec/s: 0 rss: 73Mb L: 30/50 MS: 1 CrossOver- 00:08:01.314 [2024-11-17 08:20:14.427554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069584977919 len:65536 00:08:01.314 [2024-11-17 08:20:14.427583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.427668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.314 [2024-11-17 08:20:14.427690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.427806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.314 [2024-11-17 08:20:14.427831] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.314 [2024-11-17 08:20:14.427948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.314 [2024-11-17 08:20:14.427971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.314 #22 NEW cov: 12443 ft: 14944 corp: 14/369b lim: 50 exec/s: 22 rss: 73Mb L: 48/50 MS: 3 ShuffleBytes-InsertByte-CrossOver- 00:08:01.574 [2024-11-17 08:20:14.477871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.477905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.478006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.478032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.478137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.478161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.478283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.478302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.478423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65291 00:08:01.574 [2024-11-17 08:20:14.478448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:01.574 #23 NEW cov: 12443 ft: 14982 corp: 15/419b lim: 50 exec/s: 23 rss: 73Mb L: 50/50 MS: 1 CopyPart- 00:08:01.574 [2024-11-17 08:20:14.547863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446673700840800255 len:65536 00:08:01.574 [2024-11-17 08:20:14.547897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.547991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.548017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.548132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.548154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.574 
[2024-11-17 08:20:14.548276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.548302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.574 #24 NEW cov: 12443 ft: 15010 corp: 16/467b lim: 50 exec/s: 24 rss: 73Mb L: 48/50 MS: 1 ChangeBit- 00:08:01.574 [2024-11-17 08:20:14.618111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446673700840800255 len:65536 00:08:01.574 [2024-11-17 08:20:14.618141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.618239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.618265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.618373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709504511 len:65536 00:08:01.574 [2024-11-17 08:20:14.618397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.618507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.618534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.574 #25 NEW cov: 12443 ft: 15035 corp: 17/515b lim: 50 exec/s: 25 rss: 73Mb L: 48/50 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377G"- 00:08:01.574 [2024-11-17 08:20:14.688378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069584977919 len:65536 00:08:01.574 [2024-11-17 08:20:14.688413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.688501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.688524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.688637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.688660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.574 [2024-11-17 08:20:14.688784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.574 [2024-11-17 08:20:14.688805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.574 #26 NEW cov: 12443 ft: 15054 corp: 18/564b lim: 50 exec/s: 26 rss: 73Mb L: 49/50 MS: 1 InsertByte- 00:08:01.836 [2024-11-17 
08:20:14.738300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:01.836 [2024-11-17 08:20:14.738334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.738425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.738450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.738557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.738578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.836 #28 NEW cov: 12443 ft: 15108 corp: 19/597b lim: 50 exec/s: 28 rss: 73Mb L: 33/50 MS: 2 ChangeByte-CrossOver- 00:08:01.836 [2024-11-17 08:20:14.788478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:01.836 [2024-11-17 08:20:14.788511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.788615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.788641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.788763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:12800 00:08:01.836 [2024-11-17 08:20:14.788788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.836 #29 NEW cov: 12443 ft: 15129 corp: 20/630b lim: 50 exec/s: 29 rss: 73Mb L: 33/50 MS: 1 ChangeByte- 00:08:01.836 [2024-11-17 08:20:14.858677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:01.836 [2024-11-17 08:20:14.858711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.858803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.858827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.858947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.858973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.836 #30 NEW cov: 12443 ft: 15142 corp: 21/663b lim: 50 exec/s: 30 rss: 73Mb L: 33/50 MS: 1 CopyPart- 00:08:01.836 [2024-11-17 08:20:14.908987] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069584977919 len:65536 00:08:01.836 [2024-11-17 08:20:14.909016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.909094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.909117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.909231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.909255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.909370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:01.836 [2024-11-17 08:20:14.909391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.836 #31 NEW cov: 12443 ft: 15160 corp: 22/707b lim: 50 exec/s: 31 rss: 73Mb L: 44/50 MS: 1 EraseBytes- 00:08:01.836 [2024-11-17 08:20:14.968867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:174483046 len:1 00:08:01.836 [2024-11-17 08:20:14.968903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.836 [2024-11-17 08:20:14.969037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:38562071809359872 len:1 00:08:01.836 [2024-11-17 08:20:14.969063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.095 #32 NEW cov: 12443 ft: 15182 corp: 23/729b lim: 50 exec/s: 32 rss: 73Mb L: 22/50 MS: 1 InsertByte- 00:08:02.095 [2024-11-17 08:20:15.019184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:02.095 [2024-11-17 08:20:15.019221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.095 [2024-11-17 08:20:15.019354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:02.095 [2024-11-17 08:20:15.019378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.095 [2024-11-17 08:20:15.019500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744072082161663 len:65536 00:08:02.095 [2024-11-17 08:20:15.019524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.095 #33 NEW cov: 12443 ft: 15212 corp: 24/763b lim: 50 exec/s: 33 rss: 73Mb L: 34/50 MS: 1 InsertByte- 00:08:02.095 [2024-11-17 08:20:15.089403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:18446744071478181887 len:65536 00:08:02.095 [2024-11-17 08:20:15.089442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.095 [2024-11-17 08:20:15.089554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18441544861178920959 len:57498 00:08:02.095 [2024-11-17 08:20:15.089579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.095 [2024-11-17 08:20:15.089702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744071940210943 len:65536 00:08:02.095 [2024-11-17 08:20:15.089723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.095 #34 NEW cov: 12443 ft: 15219 corp: 25/797b lim: 50 exec/s: 34 rss: 73Mb L: 34/50 MS: 1 CMP- DE: "\355\207W\340\231\226\212\000"- 00:08:02.095 [2024-11-17 08:20:15.159792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:37999122030421606 len:1 00:08:02.095 [2024-11-17 08:20:15.159828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.095 [2024-11-17 08:20:15.159927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069414649855 len:65536 00:08:02.096 [2024-11-17 08:20:15.159952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.096 [2024-11-17 08:20:15.160067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.096 [2024-11-17 08:20:15.160092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.096 [2024-11-17 08:20:15.160210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446742978492891135 len:60161 00:08:02.096 [2024-11-17 08:20:15.160233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.096 #35 NEW cov: 12443 ft: 15234 corp: 26/844b lim: 50 exec/s: 35 rss: 73Mb L: 47/50 MS: 1 InsertRepeatedBytes- 00:08:02.096 [2024-11-17 08:20:15.230086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069584977919 len:65536 00:08:02.096 [2024-11-17 08:20:15.230119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.096 [2024-11-17 08:20:15.230209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709496319 len:65536 00:08:02.096 [2024-11-17 08:20:15.230233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.096 [2024-11-17 08:20:15.230348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.096 [2024-11-17 08:20:15.230380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.096 [2024-11-17 08:20:15.230498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:02.096 [2024-11-17 08:20:15.230517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.355 #36 NEW cov: 12443 ft: 15238 corp: 27/893b lim: 50 exec/s: 36 rss: 73Mb L: 49/50 MS: 1 InsertByte- 00:08:02.355 [2024-11-17 08:20:15.279958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:258385407010406 len:1 00:08:02.355 [2024-11-17 08:20:15.279995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.355 [2024-11-17 08:20:15.280094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:60160 len:1 00:08:02.355 [2024-11-17 08:20:15.280119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.355 [2024-11-17 08:20:15.280228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15400960 len:166 00:08:02.355 [2024-11-17 08:20:15.280254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.355 #37 NEW cov: 12443 ft: 15246 corp: 28/924b lim: 50 exec/s: 37 rss: 74Mb L: 31/50 MS: 1 InsertByte- 00:08:02.355 [2024-11-17 08:20:15.350036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:258385407010406 len:1 00:08:02.355 [2024-11-17 08:20:15.350070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.355 [2024-11-17 08:20:15.350172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:258934988341248 len:1 00:08:02.355 [2024-11-17 08:20:15.350194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.355 #38 NEW cov: 12443 ft: 15260 corp: 29/949b lim: 50 exec/s: 38 rss: 74Mb L: 25/50 MS: 1 ChangeBit- 00:08:02.355 [2024-11-17 08:20:15.400558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069584977919 len:65536 00:08:02.355 [2024-11-17 08:20:15.400589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.355 [2024-11-17 08:20:15.400681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709504266 len:65536 00:08:02.355 [2024-11-17 08:20:15.400706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.355 [2024-11-17 08:20:15.400819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.355 [2024-11-17 08:20:15.400837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.355 [2024-11-17 08:20:15.400956] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:02.355 [2024-11-17 08:20:15.400978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.355 #39 NEW cov: 12443 ft: 15270 corp: 30/993b lim: 50 exec/s: 19 rss: 74Mb L: 44/50 MS: 1 CrossOver- 00:08:02.355 #39 DONE cov: 12443 ft: 15270 corp: 30/993b lim: 50 exec/s: 19 rss: 74Mb 00:08:02.355 ###### Recommended dictionary. ###### 00:08:02.355 "\377\377\377\377\377\377\377G" # Uses: 1 00:08:02.355 "\355\207W\340\231\226\212\000" # Uses: 0 00:08:02.355 ###### End of recommended dictionary. ###### 00:08:02.355 Done 39 runs in 2 second(s) 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:02.615 08:20:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:02.615 [2024-11-17 08:20:15.594937] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:02.615 [2024-11-17 08:20:15.595005] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid998364 ] 00:08:02.875 [2024-11-17 08:20:15.770959] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.875 [2024-11-17 08:20:15.792646] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.875 [2024-11-17 08:20:15.844898] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.875 [2024-11-17 08:20:15.861166] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:02.875 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.875 INFO: Seed: 2725850568 00:08:02.875 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:02.875 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:02.875 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:02.875 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.875 #2 INITED exec/s: 0 rss: 65Mb 00:08:02.875 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:02.875 This may also happen if the target rejected all inputs we tried so far 00:08:02.875 [2024-11-17 08:20:15.916551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:02.875 [2024-11-17 08:20:15.916581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.875 [2024-11-17 08:20:15.916645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:02.875 [2024-11-17 08:20:15.916664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.134 NEW_FUNC[1/716]: 0x47b978 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:03.135 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.135 #11 NEW cov: 12269 ft: 12268 corp: 2/47b lim: 90 exec/s: 0 rss: 72Mb L: 46/46 MS: 4 CMP-ChangeBinInt-ChangeBinInt-InsertRepeatedBytes- DE: "\177\000\000\000"- 00:08:03.135 [2024-11-17 08:20:16.227488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.135 [2024-11-17 08:20:16.227525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.135 [2024-11-17 08:20:16.227590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.135 [2024-11-17 08:20:16.227611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.135 #12 NEW cov: 12388 ft: 12937 corp: 3/93b lim: 90 exec/s: 0 rss: 72Mb L: 46/46 MS: 1 ShuffleBytes- 00:08:03.395 [2024-11-17 08:20:16.287378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.395 [2024-11-17 08:20:16.287405] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.395 #18 NEW cov: 12394 ft: 13940 corp: 4/127b lim: 90 exec/s: 0 rss: 72Mb L: 34/46 MS: 1 EraseBytes- 00:08:03.395 [2024-11-17 08:20:16.327487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.395 [2024-11-17 08:20:16.327515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.395 #22 NEW cov: 12479 ft: 14199 corp: 5/159b lim: 90 exec/s: 0 rss: 72Mb L: 32/46 MS: 4 CopyPart-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:03.395 [2024-11-17 08:20:16.367780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.395 [2024-11-17 08:20:16.367809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.395 [2024-11-17 08:20:16.367864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.395 [2024-11-17 08:20:16.367881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.395 #23 NEW cov: 12479 ft: 14343 corp: 6/205b lim: 90 exec/s: 0 rss: 72Mb L: 46/46 MS: 1 ChangeBit- 00:08:03.395 [2024-11-17 08:20:16.407720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.395 [2024-11-17 08:20:16.407748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.395 #24 NEW cov: 12479 ft: 14486 corp: 7/237b lim: 90 exec/s: 0 rss: 72Mb L: 32/46 MS: 1 ChangeBit- 00:08:03.395 [2024-11-17 08:20:16.468177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.395 [2024-11-17 08:20:16.468205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.395 [2024-11-17 08:20:16.468247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.395 [2024-11-17 08:20:16.468262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.395 [2024-11-17 08:20:16.468315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:03.395 [2024-11-17 08:20:16.468329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.395 #25 NEW cov: 12479 ft: 14923 corp: 8/306b lim: 90 exec/s: 0 rss: 72Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:08:03.395 [2024-11-17 08:20:16.528254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.395 [2024-11-17 08:20:16.528282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.395 [2024-11-17 08:20:16.528336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.395 [2024-11-17 08:20:16.528352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.655 #26 NEW cov: 12479 ft: 14967 corp: 9/346b lim: 90 exec/s: 0 rss: 72Mb L: 40/69 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:03.655 [2024-11-17 08:20:16.568305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.655 [2024-11-17 08:20:16.568333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.655 [2024-11-17 08:20:16.568377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.655 [2024-11-17 08:20:16.568392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.655 #27 NEW cov: 12479 ft: 14990 corp: 10/392b lim: 90 exec/s: 0 rss: 72Mb L: 46/69 MS: 1 ChangeByte- 00:08:03.655 [2024-11-17 08:20:16.628345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.655 [2024-11-17 08:20:16.628373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.655 #28 NEW cov: 12479 ft: 15067 corp: 11/425b lim: 90 exec/s: 0 rss: 72Mb L: 33/69 MS: 1 InsertByte- 00:08:03.655 [2024-11-17 08:20:16.668634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.655 [2024-11-17 08:20:16.668662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.655 [2024-11-17 08:20:16.668721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.655 [2024-11-17 08:20:16.668738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.655 #29 NEW cov: 12479 ft: 15151 corp: 12/471b lim: 90 exec/s: 0 rss: 72Mb L: 46/69 MS: 1 ChangeByte- 00:08:03.655 [2024-11-17 08:20:16.728786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.655 [2024-11-17 08:20:16.728813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.655 [2024-11-17 08:20:16.728853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.655 [2024-11-17 08:20:16.728869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.655 #30 NEW cov: 12479 ft: 15198 corp: 13/517b lim: 90 exec/s: 0 rss: 73Mb L: 46/69 MS: 1 CopyPart- 00:08:03.655 [2024-11-17 08:20:16.789003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.655 [2024-11-17 08:20:16.789030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.655 [2024-11-17 08:20:16.789073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.655 [2024-11-17 08:20:16.789089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
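Each #N NEW cov: line in this stretch is libFuzzer reporting a coverage-increasing input, interleaved with SPDK's decoded NVMe completions for that input. When working from a saved capture of this console output, a small helper along these lines (purely illustrative; console.log is an assumed path, not produced by the test scripts) pulls out the coverage progression and the end-of-run summaries:

# Hypothetical post-processing helper, not part of the autotest scripts.
LOG=${1:-console.log}    # assumed capture of this console output

# Coverage-increasing inputs and the final DONE line of each run.
grep -oE '#[0-9]+ (NEW|DONE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b lim: [0-9]+ exec/s: [0-9]+' "$LOG" |
  awk '{printf "%-6s %-4s cov=%s ft=%s corp=%s exec/s=%s\n", $1, $2, $4, $6, $8, $12}'

# Wall-clock summary printed at the end of each fuzzer run.
grep -E 'Done [0-9]+ runs in [0-9]+ second' "$LOG"

The fuzzer output for this run continues below.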
00:08:03.915 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:03.915 #31 NEW cov: 12502 ft: 15232 corp: 14/563b lim: 90 exec/s: 0 rss: 73Mb L: 46/69 MS: 1 ChangeBit- 00:08:03.915 [2024-11-17 08:20:16.829047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.915 [2024-11-17 08:20:16.829073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.915 [2024-11-17 08:20:16.829112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.915 [2024-11-17 08:20:16.829128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.915 #37 NEW cov: 12502 ft: 15251 corp: 15/610b lim: 90 exec/s: 0 rss: 73Mb L: 47/69 MS: 1 InsertByte- 00:08:03.915 [2024-11-17 08:20:16.889096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.915 [2024-11-17 08:20:16.889124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.915 #38 NEW cov: 12502 ft: 15291 corp: 16/642b lim: 90 exec/s: 38 rss: 73Mb L: 32/69 MS: 1 ChangeByte- 00:08:03.915 [2024-11-17 08:20:16.929341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.915 [2024-11-17 08:20:16.929369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.915 [2024-11-17 08:20:16.929438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.915 [2024-11-17 08:20:16.929452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.915 #39 NEW cov: 12502 ft: 15303 corp: 17/688b lim: 90 exec/s: 39 rss: 73Mb L: 46/69 MS: 1 PersAutoDict- DE: "\177\000\000\000"- 00:08:03.915 [2024-11-17 08:20:16.969487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.915 [2024-11-17 08:20:16.969515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.915 [2024-11-17 08:20:16.969565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.915 [2024-11-17 08:20:16.969579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.915 #40 NEW cov: 12502 ft: 15328 corp: 18/734b lim: 90 exec/s: 40 rss: 73Mb L: 46/69 MS: 1 ShuffleBytes- 00:08:03.915 [2024-11-17 08:20:17.009574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:03.915 [2024-11-17 08:20:17.009601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.915 [2024-11-17 08:20:17.009641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:03.915 [2024-11-17 08:20:17.009657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.915 #41 NEW cov: 12502 ft: 15360 corp: 19/782b lim: 90 exec/s: 41 rss: 73Mb L: 48/69 MS: 1 InsertByte- 00:08:04.175 [2024-11-17 08:20:17.069746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.175 [2024-11-17 08:20:17.069772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.175 [2024-11-17 08:20:17.069810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.175 [2024-11-17 08:20:17.069826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.175 #42 NEW cov: 12502 ft: 15370 corp: 20/828b lim: 90 exec/s: 42 rss: 73Mb L: 46/69 MS: 1 PersAutoDict- DE: "\177\000\000\000"- 00:08:04.175 [2024-11-17 08:20:17.110028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.175 [2024-11-17 08:20:17.110059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.175 [2024-11-17 08:20:17.110097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.175 [2024-11-17 08:20:17.110113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.175 [2024-11-17 08:20:17.110169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:04.175 [2024-11-17 08:20:17.110186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.175 #43 NEW cov: 12502 ft: 15480 corp: 21/882b lim: 90 exec/s: 43 rss: 73Mb L: 54/69 MS: 1 InsertRepeatedBytes- 00:08:04.175 [2024-11-17 08:20:17.170010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.175 [2024-11-17 08:20:17.170039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.175 [2024-11-17 08:20:17.170093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.175 [2024-11-17 08:20:17.170107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.175 #44 NEW cov: 12502 ft: 15492 corp: 22/932b lim: 90 exec/s: 44 rss: 73Mb L: 50/69 MS: 1 PersAutoDict- DE: "\177\000\000\000"- 00:08:04.175 [2024-11-17 08:20:17.210190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.175 [2024-11-17 08:20:17.210218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.175 [2024-11-17 08:20:17.210270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.175 [2024-11-17 08:20:17.210286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.175 #45 NEW cov: 12502 ft: 15502 corp: 23/978b lim: 90 exec/s: 45 rss: 73Mb L: 46/69 MS: 1 
ChangeBinInt- 00:08:04.175 [2024-11-17 08:20:17.270146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.175 [2024-11-17 08:20:17.270174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.175 #46 NEW cov: 12502 ft: 15549 corp: 24/1004b lim: 90 exec/s: 46 rss: 73Mb L: 26/69 MS: 1 EraseBytes- 00:08:04.175 [2024-11-17 08:20:17.310455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.175 [2024-11-17 08:20:17.310482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.175 [2024-11-17 08:20:17.310523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.175 [2024-11-17 08:20:17.310539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.434 #47 NEW cov: 12502 ft: 15563 corp: 25/1045b lim: 90 exec/s: 47 rss: 73Mb L: 41/69 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"- 00:08:04.434 [2024-11-17 08:20:17.370734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.434 [2024-11-17 08:20:17.370761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.434 [2024-11-17 08:20:17.370828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.434 [2024-11-17 08:20:17.370844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.434 [2024-11-17 08:20:17.370899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:04.434 [2024-11-17 08:20:17.370919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.434 #48 NEW cov: 12502 ft: 15586 corp: 26/1107b lim: 90 exec/s: 48 rss: 73Mb L: 62/69 MS: 1 CrossOver- 00:08:04.434 [2024-11-17 08:20:17.410690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.434 [2024-11-17 08:20:17.410723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.434 [2024-11-17 08:20:17.410776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.434 [2024-11-17 08:20:17.410793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.434 #49 NEW cov: 12502 ft: 15597 corp: 27/1154b lim: 90 exec/s: 49 rss: 73Mb L: 47/69 MS: 1 InsertByte- 00:08:04.434 [2024-11-17 08:20:17.470887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.434 [2024-11-17 08:20:17.470914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.434 [2024-11-17 08:20:17.470962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.434 
[2024-11-17 08:20:17.470978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.434 #50 NEW cov: 12502 ft: 15614 corp: 28/1200b lim: 90 exec/s: 50 rss: 73Mb L: 46/69 MS: 1 CrossOver- 00:08:04.434 [2024-11-17 08:20:17.510841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.434 [2024-11-17 08:20:17.510868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.434 #51 NEW cov: 12502 ft: 15742 corp: 29/1232b lim: 90 exec/s: 51 rss: 73Mb L: 32/69 MS: 1 ChangeByte- 00:08:04.434 [2024-11-17 08:20:17.550955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.434 [2024-11-17 08:20:17.550981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.693 #52 NEW cov: 12502 ft: 15748 corp: 30/1251b lim: 90 exec/s: 52 rss: 73Mb L: 19/69 MS: 1 CrossOver- 00:08:04.693 [2024-11-17 08:20:17.591083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.693 [2024-11-17 08:20:17.591110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.693 #53 NEW cov: 12502 ft: 15793 corp: 31/1278b lim: 90 exec/s: 53 rss: 73Mb L: 27/69 MS: 1 CMP- DE: "H\000\000\000\000\000\000\000"- 00:08:04.693 [2024-11-17 08:20:17.651778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.693 [2024-11-17 08:20:17.651805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.693 [2024-11-17 08:20:17.651878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.693 [2024-11-17 08:20:17.651894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.693 [2024-11-17 08:20:17.651948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:04.693 [2024-11-17 08:20:17.651962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.693 [2024-11-17 08:20:17.652016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:04.693 [2024-11-17 08:20:17.652031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.693 #59 NEW cov: 12502 ft: 16170 corp: 32/1351b lim: 90 exec/s: 59 rss: 74Mb L: 73/73 MS: 1 CrossOver- 00:08:04.693 [2024-11-17 08:20:17.711778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.693 [2024-11-17 08:20:17.711805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.693 [2024-11-17 08:20:17.711849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.693 [2024-11-17 08:20:17.711864] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.693 [2024-11-17 08:20:17.711920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:04.693 [2024-11-17 08:20:17.711935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.693 #60 NEW cov: 12502 ft: 16192 corp: 33/1418b lim: 90 exec/s: 60 rss: 74Mb L: 67/73 MS: 1 CrossOver- 00:08:04.693 [2024-11-17 08:20:17.751698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.693 [2024-11-17 08:20:17.751724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.693 [2024-11-17 08:20:17.751764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.693 [2024-11-17 08:20:17.751779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.693 #61 NEW cov: 12502 ft: 16269 corp: 34/1465b lim: 90 exec/s: 61 rss: 74Mb L: 47/73 MS: 1 InsertByte- 00:08:04.693 [2024-11-17 08:20:17.791661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.693 [2024-11-17 08:20:17.791689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.953 [2024-11-17 08:20:17.831808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.953 [2024-11-17 08:20:17.831836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.953 #63 NEW cov: 12502 ft: 16274 corp: 35/1484b lim: 90 exec/s: 63 rss: 74Mb L: 19/73 MS: 2 CopyPart-ChangeBinInt- 00:08:04.953 [2024-11-17 08:20:17.871937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.953 [2024-11-17 08:20:17.871965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.953 #64 pulse cov: 12502 ft: 16302 corp: 35/1484b lim: 90 exec/s: 32 rss: 74Mb 00:08:04.953 #64 NEW cov: 12502 ft: 16302 corp: 36/1516b lim: 90 exec/s: 32 rss: 74Mb L: 32/73 MS: 1 ChangeBinInt- 00:08:04.953 #64 DONE cov: 12502 ft: 16302 corp: 36/1516b lim: 90 exec/s: 32 rss: 74Mb 00:08:04.953 ###### Recommended dictionary. ###### 00:08:04.953 "\177\000\000\000" # Uses: 6 00:08:04.953 "\003\000\000\000\000\000\000\000" # Uses: 1 00:08:04.953 "H\000\000\000\000\000\000\000" # Uses: 1 00:08:04.953 ###### End of recommended dictionary. 
###### 00:08:04.953 Done 64 runs in 2 second(s) 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:04.953 08:20:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:04.953 [2024-11-17 08:20:18.065540] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:04.953 [2024-11-17 08:20:18.065616] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid998660 ] 00:08:05.212 [2024-11-17 08:20:18.247204] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.212 [2024-11-17 08:20:18.269764] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.212 [2024-11-17 08:20:18.322423] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.212 [2024-11-17 08:20:18.338746] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:05.472 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.472 INFO: Seed: 906895798 00:08:05.472 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:05.472 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:05.472 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:05.472 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.472 #2 INITED exec/s: 0 rss: 66Mb 00:08:05.472 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:05.472 This may also happen if the target rejected all inputs we tried so far 00:08:05.472 [2024-11-17 08:20:18.384411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.472 [2024-11-17 08:20:18.384441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.472 [2024-11-17 08:20:18.384477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:05.472 [2024-11-17 08:20:18.384493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.472 [2024-11-17 08:20:18.384546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:05.472 [2024-11-17 08:20:18.384562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.472 [2024-11-17 08:20:18.384614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:05.472 [2024-11-17 08:20:18.384629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.732 NEW_FUNC[1/716]: 0x47eba8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:05.732 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.732 #5 NEW cov: 12250 ft: 12249 corp: 2/42b lim: 50 exec/s: 0 rss: 72Mb L: 41/41 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:05.732 [2024-11-17 08:20:18.714847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.732 [2024-11-17 08:20:18.714881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
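As the ../common.sh lines in these traces show, each fuzzer type is driven by the same outer loop: start_llvm_fuzz <type> <time> <core-mask> with a one-second budget on core mask 0x1, after which i is incremented and tested against fuzz_num. A minimal sketch of that driver follows; fuzz_num, the starting index, and any surrounding setup are assumptions, and only the call shape is taken from the trace.

# Illustrative sketch of the loop implied by the ../common.sh@72-73 trace lines.
fuzz_num=25      # assumed upper bound; this section of the log only shows types 19-22
timen=1          # seconds per fuzzer, matching "local timen=1" in nvmf/run.sh
core=0x1         # core mask, matching "local core=0x1" in nvmf/run.sh
for (( i = 0; i < fuzz_num; i++ )); do
  start_llvm_fuzz "$i" "$timen" "$core"   # nvmf/run.sh builds port 44NN, config, and corpus dir for type $i
done

Run 21's fuzzer output continues below.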
00:08:05.732 #7 NEW cov: 12363 ft: 13635 corp: 3/52b lim: 50 exec/s: 0 rss: 72Mb L: 10/41 MS: 2 InsertByte-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:05.732 [2024-11-17 08:20:18.754899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.732 [2024-11-17 08:20:18.754928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.732 #8 NEW cov: 12369 ft: 13784 corp: 4/62b lim: 50 exec/s: 0 rss: 72Mb L: 10/41 MS: 1 CrossOver- 00:08:05.732 [2024-11-17 08:20:18.815510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.732 [2024-11-17 08:20:18.815537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.732 [2024-11-17 08:20:18.815605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:05.732 [2024-11-17 08:20:18.815623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.732 [2024-11-17 08:20:18.815676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:05.732 [2024-11-17 08:20:18.815692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.732 [2024-11-17 08:20:18.815755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:05.732 [2024-11-17 08:20:18.815782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.732 #9 NEW cov: 12454 ft: 13990 corp: 5/103b lim: 50 exec/s: 0 rss: 72Mb L: 41/41 MS: 1 CopyPart- 00:08:05.994 [2024-11-17 08:20:18.875704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.994 [2024-11-17 08:20:18.875734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.994 [2024-11-17 08:20:18.875791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:05.994 [2024-11-17 08:20:18.875806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.994 [2024-11-17 08:20:18.875859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:05.994 [2024-11-17 08:20:18.875875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.994 [2024-11-17 08:20:18.875929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:05.994 [2024-11-17 08:20:18.875945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.994 #10 NEW cov: 12454 ft: 14163 corp: 6/144b lim: 50 exec/s: 0 rss: 72Mb L: 41/41 MS: 1 ChangeBit- 00:08:05.994 [2024-11-17 08:20:18.915302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.994 [2024-11-17 08:20:18.915329] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.994 #11 NEW cov: 12454 ft: 14286 corp: 7/154b lim: 50 exec/s: 0 rss: 73Mb L: 10/41 MS: 1 ShuffleBytes- 00:08:05.994 [2024-11-17 08:20:18.975976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.994 [2024-11-17 08:20:18.976004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.994 [2024-11-17 08:20:18.976073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:05.994 [2024-11-17 08:20:18.976089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.994 [2024-11-17 08:20:18.976141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:05.994 [2024-11-17 08:20:18.976157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.994 [2024-11-17 08:20:18.976211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:05.994 [2024-11-17 08:20:18.976227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.994 #12 NEW cov: 12454 ft: 14347 corp: 8/195b lim: 50 exec/s: 0 rss: 73Mb L: 41/41 MS: 1 CrossOver- 00:08:05.994 [2024-11-17 08:20:19.035632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.994 [2024-11-17 08:20:19.035660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.994 #13 NEW cov: 12454 ft: 14413 corp: 9/214b lim: 50 exec/s: 0 rss: 73Mb L: 19/41 MS: 1 CrossOver- 00:08:05.994 [2024-11-17 08:20:19.075779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.994 [2024-11-17 08:20:19.075809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.994 #14 NEW cov: 12454 ft: 14480 corp: 10/224b lim: 50 exec/s: 0 rss: 73Mb L: 10/41 MS: 1 CopyPart- 00:08:05.994 [2024-11-17 08:20:19.115863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:05.994 [2024-11-17 08:20:19.115891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.259 #15 NEW cov: 12454 ft: 14590 corp: 11/234b lim: 50 exec/s: 0 rss: 73Mb L: 10/41 MS: 1 ChangeBinInt- 00:08:06.259 [2024-11-17 08:20:19.176058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.259 [2024-11-17 08:20:19.176085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.259 #16 NEW cov: 12454 ft: 14624 corp: 12/244b lim: 50 exec/s: 0 rss: 73Mb L: 10/41 MS: 1 CrossOver- 00:08:06.259 [2024-11-17 08:20:19.216610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.259 [2024-11-17 08:20:19.216637] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.259 [2024-11-17 08:20:19.216700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:06.259 [2024-11-17 08:20:19.216715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.259 [2024-11-17 08:20:19.216768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:06.259 [2024-11-17 08:20:19.216784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.259 [2024-11-17 08:20:19.216837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:06.259 [2024-11-17 08:20:19.216853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.259 #17 NEW cov: 12454 ft: 14649 corp: 13/288b lim: 50 exec/s: 0 rss: 73Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:06.259 [2024-11-17 08:20:19.276793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.259 [2024-11-17 08:20:19.276821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.259 [2024-11-17 08:20:19.276901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:06.259 [2024-11-17 08:20:19.276917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.259 [2024-11-17 08:20:19.276971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:06.259 [2024-11-17 08:20:19.276987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.260 [2024-11-17 08:20:19.277040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:06.260 [2024-11-17 08:20:19.277056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.260 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:06.260 #18 NEW cov: 12477 ft: 14747 corp: 14/329b lim: 50 exec/s: 0 rss: 73Mb L: 41/44 MS: 1 ChangeBit- 00:08:06.260 [2024-11-17 08:20:19.316431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.260 [2024-11-17 08:20:19.316458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.260 #19 NEW cov: 12477 ft: 14791 corp: 15/340b lim: 50 exec/s: 0 rss: 73Mb L: 11/44 MS: 1 InsertByte- 00:08:06.260 [2024-11-17 08:20:19.376635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.260 [2024-11-17 08:20:19.376663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.519 #20 NEW cov: 12477 ft: 14808 corp: 16/350b lim: 
50 exec/s: 20 rss: 73Mb L: 10/44 MS: 1 ChangeBinInt- 00:08:06.519 [2024-11-17 08:20:19.417208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.519 [2024-11-17 08:20:19.417236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.519 [2024-11-17 08:20:19.417288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:06.519 [2024-11-17 08:20:19.417303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.519 [2024-11-17 08:20:19.417354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:06.519 [2024-11-17 08:20:19.417370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.519 [2024-11-17 08:20:19.417421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:06.519 [2024-11-17 08:20:19.417437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.519 #21 NEW cov: 12477 ft: 14819 corp: 17/391b lim: 50 exec/s: 21 rss: 73Mb L: 41/44 MS: 1 ChangeBit- 00:08:06.519 [2024-11-17 08:20:19.456836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.519 [2024-11-17 08:20:19.456863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.519 #22 NEW cov: 12477 ft: 14873 corp: 18/401b lim: 50 exec/s: 22 rss: 73Mb L: 10/44 MS: 1 ShuffleBytes- 00:08:06.519 [2024-11-17 08:20:19.517491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.519 [2024-11-17 08:20:19.517518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.519 [2024-11-17 08:20:19.517585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:06.519 [2024-11-17 08:20:19.517602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.519 [2024-11-17 08:20:19.517654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:06.519 [2024-11-17 08:20:19.517670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.519 [2024-11-17 08:20:19.517728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:06.519 [2024-11-17 08:20:19.517744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.519 #23 NEW cov: 12477 ft: 14882 corp: 19/443b lim: 50 exec/s: 23 rss: 73Mb L: 42/44 MS: 1 InsertByte- 00:08:06.520 [2024-11-17 08:20:19.577630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.520 [2024-11-17 08:20:19.577657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.520 [2024-11-17 08:20:19.577729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:06.520 [2024-11-17 08:20:19.577747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.520 [2024-11-17 08:20:19.577809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:06.520 [2024-11-17 08:20:19.577825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.520 [2024-11-17 08:20:19.577882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:06.520 [2024-11-17 08:20:19.577898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.520 #24 NEW cov: 12477 ft: 14898 corp: 20/484b lim: 50 exec/s: 24 rss: 73Mb L: 41/44 MS: 1 ChangeBinInt- 00:08:06.520 [2024-11-17 08:20:19.617264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.520 [2024-11-17 08:20:19.617292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.779 #25 NEW cov: 12477 ft: 14939 corp: 21/502b lim: 50 exec/s: 25 rss: 73Mb L: 18/44 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:06.779 [2024-11-17 08:20:19.677468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.779 [2024-11-17 08:20:19.677496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.779 #26 NEW cov: 12477 ft: 14983 corp: 22/520b lim: 50 exec/s: 26 rss: 74Mb L: 18/44 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:06.779 [2024-11-17 08:20:19.737644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.779 [2024-11-17 08:20:19.737674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.779 [2024-11-17 08:20:19.777727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.779 [2024-11-17 08:20:19.777755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.779 #28 NEW cov: 12477 ft: 15024 corp: 23/530b lim: 50 exec/s: 28 rss: 74Mb L: 10/44 MS: 2 ChangeBinInt-ChangeBinInt- 00:08:06.779 [2024-11-17 08:20:19.817904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.779 [2024-11-17 08:20:19.817932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.779 #29 NEW cov: 12477 ft: 15031 corp: 24/540b lim: 50 exec/s: 29 rss: 74Mb L: 10/44 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:06.779 [2024-11-17 08:20:19.857985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:06.779 [2024-11-17 08:20:19.858013] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.779 #30 NEW cov: 12477 ft: 15053 corp: 25/559b lim: 50 exec/s: 30 rss: 74Mb L: 19/44 MS: 1 CrossOver- 00:08:07.039 [2024-11-17 08:20:19.918168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.039 [2024-11-17 08:20:19.918198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.039 #31 NEW cov: 12477 ft: 15099 corp: 26/569b lim: 50 exec/s: 31 rss: 74Mb L: 10/44 MS: 1 ShuffleBytes- 00:08:07.039 [2024-11-17 08:20:19.958279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.039 [2024-11-17 08:20:19.958308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.039 #32 NEW cov: 12477 ft: 15105 corp: 27/579b lim: 50 exec/s: 32 rss: 74Mb L: 10/44 MS: 1 ChangeByte- 00:08:07.039 [2024-11-17 08:20:19.998865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.039 [2024-11-17 08:20:19.998893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.039 [2024-11-17 08:20:19.998961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.039 [2024-11-17 08:20:19.998978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.039 [2024-11-17 08:20:19.999028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.039 [2024-11-17 08:20:19.999045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.039 [2024-11-17 08:20:19.999097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:07.039 [2024-11-17 08:20:19.999111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.039 #33 NEW cov: 12477 ft: 15175 corp: 28/621b lim: 50 exec/s: 33 rss: 74Mb L: 42/44 MS: 1 InsertByte- 00:08:07.039 [2024-11-17 08:20:20.058612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.039 [2024-11-17 08:20:20.058641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.039 #34 NEW cov: 12477 ft: 15209 corp: 29/631b lim: 50 exec/s: 34 rss: 74Mb L: 10/44 MS: 1 ChangeBit- 00:08:07.039 [2024-11-17 08:20:20.119227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.039 [2024-11-17 08:20:20.119256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.039 [2024-11-17 08:20:20.119304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.039 [2024-11-17 08:20:20.119320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.039 
[2024-11-17 08:20:20.119373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.039 [2024-11-17 08:20:20.119387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.039 [2024-11-17 08:20:20.119446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:07.039 [2024-11-17 08:20:20.119462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.039 #35 NEW cov: 12477 ft: 15236 corp: 30/672b lim: 50 exec/s: 35 rss: 74Mb L: 41/44 MS: 1 ChangeBit- 00:08:07.039 [2024-11-17 08:20:20.159069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.039 [2024-11-17 08:20:20.159098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.299 #36 NEW cov: 12477 ft: 15270 corp: 31/682b lim: 50 exec/s: 36 rss: 74Mb L: 10/44 MS: 1 InsertRepeatedBytes- 00:08:07.299 [2024-11-17 08:20:20.198969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.299 [2024-11-17 08:20:20.198997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.299 #37 NEW cov: 12477 ft: 15282 corp: 32/692b lim: 50 exec/s: 37 rss: 74Mb L: 10/44 MS: 1 CopyPart- 00:08:07.299 [2024-11-17 08:20:20.239516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.299 [2024-11-17 08:20:20.239543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.299 [2024-11-17 08:20:20.239599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.299 [2024-11-17 08:20:20.239615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.299 [2024-11-17 08:20:20.239670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.299 [2024-11-17 08:20:20.239687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.299 [2024-11-17 08:20:20.239746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:07.299 [2024-11-17 08:20:20.239762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.299 #38 NEW cov: 12477 ft: 15307 corp: 33/738b lim: 50 exec/s: 38 rss: 74Mb L: 46/46 MS: 1 CopyPart- 00:08:07.299 [2024-11-17 08:20:20.299364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.299 [2024-11-17 08:20:20.299392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.299 [2024-11-17 08:20:20.299436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.299 [2024-11-17 08:20:20.299452] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.299 #39 NEW cov: 12477 ft: 15650 corp: 34/758b lim: 50 exec/s: 39 rss: 74Mb L: 20/46 MS: 1 CopyPart- 00:08:07.299 [2024-11-17 08:20:20.359407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.299 [2024-11-17 08:20:20.359436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.299 #44 NEW cov: 12477 ft: 15757 corp: 35/769b lim: 50 exec/s: 22 rss: 74Mb L: 11/46 MS: 5 CrossOver-PersAutoDict-CrossOver-EraseBytes-CopyPart- DE: "\000\000\000\000\000\000\000\000"- 00:08:07.299 #44 DONE cov: 12477 ft: 15757 corp: 35/769b lim: 50 exec/s: 22 rss: 74Mb 00:08:07.299 ###### Recommended dictionary. ###### 00:08:07.299 "\000\000\000\000\000\000\000\000" # Uses: 4 00:08:07.299 ###### End of recommended dictionary. ###### 00:08:07.299 Done 44 runs in 2 second(s) 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:07.559 08:20:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:07.559 [2024-11-17 08:20:20.536263] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:07.559 [2024-11-17 08:20:20.536367] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999186 ] 00:08:07.819 [2024-11-17 08:20:20.719616] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.819 [2024-11-17 08:20:20.742079] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.819 [2024-11-17 08:20:20.794760] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.819 [2024-11-17 08:20:20.811080] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:07.819 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.819 INFO: Seed: 3380888403 00:08:07.819 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:07.819 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:07.819 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:07.819 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.819 #2 INITED exec/s: 0 rss: 65Mb 00:08:07.819 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:07.819 This may also happen if the target rejected all inputs we tried so far 00:08:07.819 [2024-11-17 08:20:20.866517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:07.819 [2024-11-17 08:20:20.866554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.819 [2024-11-17 08:20:20.866620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:07.819 [2024-11-17 08:20:20.866646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.079 NEW_FUNC[1/716]: 0x480e78 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:08.079 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.079 #12 NEW cov: 12276 ft: 12275 corp: 2/49b lim: 85 exec/s: 0 rss: 72Mb L: 48/48 MS: 5 ChangeByte-ChangeBit-ChangeBit-CrossOver-InsertRepeatedBytes- 00:08:08.079 [2024-11-17 08:20:21.207368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.079 [2024-11-17 08:20:21.207426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.338 #17 NEW cov: 12389 ft: 13752 corp: 3/77b lim: 85 exec/s: 0 rss: 72Mb L: 28/48 MS: 5 InsertByte-CrossOver-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:08:08.338 [2024-11-17 08:20:21.257395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.339 [2024-11-17 08:20:21.257423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.339 [2024-11-17 08:20:21.257473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.339 [2024-11-17 08:20:21.257487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.339 #20 NEW cov: 12395 ft: 14156 corp: 4/126b lim: 85 exec/s: 0 rss: 72Mb L: 49/49 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:08:08.339 [2024-11-17 08:20:21.297356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.339 [2024-11-17 08:20:21.297383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.339 #21 NEW cov: 12480 ft: 14396 corp: 5/155b lim: 85 exec/s: 0 rss: 72Mb L: 29/49 MS: 1 InsertByte- 00:08:08.339 [2024-11-17 08:20:21.357674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.339 [2024-11-17 08:20:21.357707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.339 [2024-11-17 08:20:21.357745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.339 [2024-11-17 08:20:21.357761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.339 #22 NEW cov: 12480 ft: 14438 corp: 6/203b lim: 85 exec/s: 0 rss: 72Mb L: 48/49 MS: 1 CopyPart- 00:08:08.339 [2024-11-17 08:20:21.417706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.339 [2024-11-17 08:20:21.417734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.339 #23 NEW cov: 12480 ft: 14560 corp: 7/227b lim: 85 exec/s: 0 rss: 72Mb L: 24/49 MS: 1 EraseBytes- 00:08:08.598 [2024-11-17 08:20:21.477883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.477911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 #24 NEW cov: 12480 ft: 14636 corp: 8/255b lim: 85 exec/s: 0 rss: 72Mb L: 28/49 MS: 1 ChangeBinInt- 00:08:08.598 [2024-11-17 08:20:21.517987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.518014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 #25 NEW cov: 12480 ft: 14659 corp: 9/283b lim: 85 exec/s: 0 rss: 72Mb L: 28/49 MS: 1 ShuffleBytes- 00:08:08.598 [2024-11-17 08:20:21.558237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.558267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 [2024-11-17 08:20:21.558322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.598 [2024-11-17 08:20:21.558339] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.598 #26 NEW cov: 12480 ft: 14700 corp: 10/319b lim: 85 exec/s: 0 rss: 72Mb L: 36/49 MS: 1 CMP- DE: "k\2222\303\230\226\212\000"- 00:08:08.598 [2024-11-17 08:20:21.598339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.598366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 [2024-11-17 08:20:21.598403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.598 [2024-11-17 08:20:21.598419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.598 #27 NEW cov: 12480 ft: 14743 corp: 11/353b lim: 85 exec/s: 0 rss: 72Mb L: 34/49 MS: 1 CopyPart- 00:08:08.598 [2024-11-17 08:20:21.638408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.638435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 [2024-11-17 08:20:21.638473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.598 [2024-11-17 08:20:21.638487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.598 #28 NEW cov: 12480 ft: 14761 corp: 12/401b lim: 85 exec/s: 0 rss: 72Mb L: 48/49 MS: 1 ChangeBinInt- 00:08:08.598 [2024-11-17 08:20:21.678718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.678745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 [2024-11-17 08:20:21.678783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.598 [2024-11-17 08:20:21.678799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.598 [2024-11-17 08:20:21.678851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:08.598 [2024-11-17 08:20:21.678866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.598 #29 NEW cov: 12480 ft: 15124 corp: 13/458b lim: 85 exec/s: 0 rss: 72Mb L: 57/57 MS: 1 CrossOver- 00:08:08.598 [2024-11-17 08:20:21.718670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.598 [2024-11-17 08:20:21.718700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.598 [2024-11-17 08:20:21.718756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.598 [2024-11-17 08:20:21.718773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.858 #30 NEW cov: 12480 ft: 15160 corp: 14/506b lim: 85 
exec/s: 0 rss: 72Mb L: 48/57 MS: 1 PersAutoDict- DE: "k\2222\303\230\226\212\000"- 00:08:08.858 [2024-11-17 08:20:21.758666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.858 [2024-11-17 08:20:21.758698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.858 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:08.858 #31 NEW cov: 12503 ft: 15223 corp: 15/534b lim: 85 exec/s: 0 rss: 72Mb L: 28/57 MS: 1 ChangeBinInt- 00:08:08.858 [2024-11-17 08:20:21.799017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.858 [2024-11-17 08:20:21.799043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.858 [2024-11-17 08:20:21.799085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.858 [2024-11-17 08:20:21.799100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.858 [2024-11-17 08:20:21.799153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:08.858 [2024-11-17 08:20:21.799168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.858 #32 NEW cov: 12503 ft: 15232 corp: 16/585b lim: 85 exec/s: 0 rss: 73Mb L: 51/57 MS: 1 InsertRepeatedBytes- 00:08:08.858 [2024-11-17 08:20:21.859057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.858 [2024-11-17 08:20:21.859083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.858 [2024-11-17 08:20:21.859123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.858 [2024-11-17 08:20:21.859137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.858 #38 NEW cov: 12503 ft: 15258 corp: 17/634b lim: 85 exec/s: 38 rss: 73Mb L: 49/57 MS: 1 ChangeByte- 00:08:08.858 [2024-11-17 08:20:21.919253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.859 [2024-11-17 08:20:21.919278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.859 [2024-11-17 08:20:21.919342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.859 [2024-11-17 08:20:21.919358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.859 #39 NEW cov: 12503 ft: 15353 corp: 18/674b lim: 85 exec/s: 39 rss: 73Mb L: 40/57 MS: 1 EraseBytes- 00:08:08.859 [2024-11-17 08:20:21.959460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:08.859 [2024-11-17 08:20:21.959486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:08.859 [2024-11-17 08:20:21.959523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:08.859 [2024-11-17 08:20:21.959538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.859 [2024-11-17 08:20:21.959592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:08.859 [2024-11-17 08:20:21.959608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.118 #40 NEW cov: 12503 ft: 15362 corp: 19/730b lim: 85 exec/s: 40 rss: 73Mb L: 56/57 MS: 1 InsertRepeatedBytes- 00:08:09.118 [2024-11-17 08:20:22.019656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.118 [2024-11-17 08:20:22.019682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.118 [2024-11-17 08:20:22.019751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.118 [2024-11-17 08:20:22.019767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.118 [2024-11-17 08:20:22.019823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.118 [2024-11-17 08:20:22.019839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.118 #41 NEW cov: 12503 ft: 15379 corp: 20/789b lim: 85 exec/s: 41 rss: 73Mb L: 59/59 MS: 1 InsertRepeatedBytes- 00:08:09.118 [2024-11-17 08:20:22.059576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.118 [2024-11-17 08:20:22.059602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.118 [2024-11-17 08:20:22.059657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.118 [2024-11-17 08:20:22.059685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.118 #42 NEW cov: 12503 ft: 15387 corp: 21/839b lim: 85 exec/s: 42 rss: 73Mb L: 50/59 MS: 1 CrossOver- 00:08:09.118 [2024-11-17 08:20:22.119947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.118 [2024-11-17 08:20:22.119974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.119 [2024-11-17 08:20:22.120028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.119 [2024-11-17 08:20:22.120044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.119 [2024-11-17 08:20:22.120098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.119 [2024-11-17 08:20:22.120112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.119 #43 NEW cov: 12503 ft: 15417 corp: 22/895b lim: 85 exec/s: 43 rss: 73Mb L: 56/59 MS: 1 ChangeBit- 00:08:09.119 [2024-11-17 08:20:22.180120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.119 [2024-11-17 08:20:22.180146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.119 [2024-11-17 08:20:22.180211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.119 [2024-11-17 08:20:22.180227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.119 [2024-11-17 08:20:22.180282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.119 [2024-11-17 08:20:22.180298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.119 #44 NEW cov: 12503 ft: 15434 corp: 23/960b lim: 85 exec/s: 44 rss: 73Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:08:09.119 [2024-11-17 08:20:22.219906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.119 [2024-11-17 08:20:22.219932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.378 #45 NEW cov: 12503 ft: 15451 corp: 24/980b lim: 85 exec/s: 45 rss: 73Mb L: 20/65 MS: 1 EraseBytes- 00:08:09.378 [2024-11-17 08:20:22.280401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.378 [2024-11-17 08:20:22.280427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.280489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.378 [2024-11-17 08:20:22.280505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.280564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.378 [2024-11-17 08:20:22.280579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.378 #46 NEW cov: 12503 ft: 15471 corp: 25/1036b lim: 85 exec/s: 46 rss: 73Mb L: 56/65 MS: 1 ShuffleBytes- 00:08:09.378 [2024-11-17 08:20:22.340551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.378 [2024-11-17 08:20:22.340576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.340639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.378 [2024-11-17 08:20:22.340655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.340710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.378 
[2024-11-17 08:20:22.340727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.378 #47 NEW cov: 12503 ft: 15473 corp: 26/1087b lim: 85 exec/s: 47 rss: 73Mb L: 51/65 MS: 1 InsertByte- 00:08:09.378 [2024-11-17 08:20:22.400879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.378 [2024-11-17 08:20:22.400905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.400974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.378 [2024-11-17 08:20:22.400990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.401042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.378 [2024-11-17 08:20:22.401056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.401112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:09.378 [2024-11-17 08:20:22.401127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.378 #48 NEW cov: 12503 ft: 15836 corp: 27/1166b lim: 85 exec/s: 48 rss: 73Mb L: 79/79 MS: 1 CopyPart- 00:08:09.378 [2024-11-17 08:20:22.461033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.378 [2024-11-17 08:20:22.461059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.461129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.378 [2024-11-17 08:20:22.461145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.461198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.378 [2024-11-17 08:20:22.461214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.378 [2024-11-17 08:20:22.461269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:09.378 [2024-11-17 08:20:22.461285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.378 #49 NEW cov: 12503 ft: 15860 corp: 28/1235b lim: 85 exec/s: 49 rss: 74Mb L: 69/79 MS: 1 InsertRepeatedBytes- 00:08:09.638 [2024-11-17 08:20:22.520805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.638 [2024-11-17 08:20:22.520835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.638 #50 NEW cov: 12503 ft: 15886 corp: 29/1264b lim: 85 exec/s: 50 rss: 74Mb L: 29/79 MS: 1 ChangeByte- 00:08:09.638 [2024-11-17 
08:20:22.560851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.638 [2024-11-17 08:20:22.560877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.638 #51 NEW cov: 12503 ft: 15929 corp: 30/1293b lim: 85 exec/s: 51 rss: 74Mb L: 29/79 MS: 1 EraseBytes- 00:08:09.638 [2024-11-17 08:20:22.621358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.638 [2024-11-17 08:20:22.621384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.638 [2024-11-17 08:20:22.621446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.638 [2024-11-17 08:20:22.621462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.638 [2024-11-17 08:20:22.621516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.638 [2024-11-17 08:20:22.621531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.638 #52 NEW cov: 12503 ft: 15958 corp: 31/1352b lim: 85 exec/s: 52 rss: 74Mb L: 59/79 MS: 1 PersAutoDict- DE: "k\2222\303\230\226\212\000"- 00:08:09.638 [2024-11-17 08:20:22.681407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.638 [2024-11-17 08:20:22.681435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.638 [2024-11-17 08:20:22.681489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.638 [2024-11-17 08:20:22.681505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.638 #53 NEW cov: 12503 ft: 15969 corp: 32/1402b lim: 85 exec/s: 53 rss: 74Mb L: 50/79 MS: 1 ChangeBinInt- 00:08:09.638 [2024-11-17 08:20:22.721827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.638 [2024-11-17 08:20:22.721854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.638 [2024-11-17 08:20:22.721912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.638 [2024-11-17 08:20:22.721927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.638 [2024-11-17 08:20:22.721980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.638 [2024-11-17 08:20:22.721995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.638 [2024-11-17 08:20:22.722050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:09.638 [2024-11-17 08:20:22.722077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:09.638 #54 NEW cov: 12503 ft: 15986 corp: 33/1484b lim: 85 exec/s: 54 rss: 74Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:08:09.899 [2024-11-17 08:20:22.781868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.899 [2024-11-17 08:20:22.781896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.899 [2024-11-17 08:20:22.781949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.899 [2024-11-17 08:20:22.781969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.899 [2024-11-17 08:20:22.782023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:09.899 [2024-11-17 08:20:22.782040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.899 #55 NEW cov: 12503 ft: 16026 corp: 34/1535b lim: 85 exec/s: 55 rss: 74Mb L: 51/82 MS: 1 CopyPart- 00:08:09.899 [2024-11-17 08:20:22.821633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.899 [2024-11-17 08:20:22.821662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.899 #56 NEW cov: 12503 ft: 16144 corp: 35/1563b lim: 85 exec/s: 56 rss: 74Mb L: 28/82 MS: 1 CMP- DE: "\000\000"- 00:08:09.899 [2024-11-17 08:20:22.861920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.899 [2024-11-17 08:20:22.861946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.899 [2024-11-17 08:20:22.862001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:09.899 [2024-11-17 08:20:22.862017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.899 #57 NEW cov: 12503 ft: 16173 corp: 36/1613b lim: 85 exec/s: 28 rss: 74Mb L: 50/82 MS: 1 ChangeBinInt- 00:08:09.899 #57 DONE cov: 12503 ft: 16173 corp: 36/1613b lim: 85 exec/s: 28 rss: 74Mb 00:08:09.899 ###### Recommended dictionary. ###### 00:08:09.899 "k\2222\303\230\226\212\000" # Uses: 2 00:08:09.899 "\000\000" # Uses: 0 00:08:09.899 ###### End of recommended dictionary. 
###### 00:08:09.899 Done 57 runs in 2 second(s) 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:09.899 08:20:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:09.899 08:20:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:09.899 [2024-11-17 08:20:23.034329] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:09.899 [2024-11-17 08:20:23.034413] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999624 ] 00:08:10.159 [2024-11-17 08:20:23.210261] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.159 [2024-11-17 08:20:23.232220] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.159 [2024-11-17 08:20:23.284572] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.418 [2024-11-17 08:20:23.300894] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:10.418 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.418 INFO: Seed: 1574929104 00:08:10.418 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:10.418 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:10.418 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:10.418 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.418 #2 INITED exec/s: 0 rss: 65Mb 00:08:10.418 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:10.418 This may also happen if the target rejected all inputs we tried so far 00:08:10.418 [2024-11-17 08:20:23.346089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.418 [2024-11-17 08:20:23.346119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.678 NEW_FUNC[1/715]: 0x4840b8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:10.678 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.678 #7 NEW cov: 12209 ft: 12203 corp: 2/7b lim: 25 exec/s: 0 rss: 72Mb L: 6/6 MS: 5 InsertByte-ChangeByte-ChangeByte-InsertByte-CopyPart- 00:08:10.678 [2024-11-17 08:20:23.676909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.678 [2024-11-17 08:20:23.676941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.678 #8 NEW cov: 12322 ft: 12673 corp: 3/13b lim: 25 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ChangeBit- 00:08:10.678 [2024-11-17 08:20:23.737357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.678 [2024-11-17 08:20:23.737385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.678 [2024-11-17 08:20:23.737435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:10.678 [2024-11-17 08:20:23.737450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.678 [2024-11-17 08:20:23.737500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:10.678 [2024-11-17 08:20:23.737515] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.678 [2024-11-17 08:20:23.737568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:10.678 [2024-11-17 08:20:23.737583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.678 #10 NEW cov: 12328 ft: 13408 corp: 4/34b lim: 25 exec/s: 0 rss: 72Mb L: 21/21 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:10.678 [2024-11-17 08:20:23.777065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.678 [2024-11-17 08:20:23.777092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.678 #11 NEW cov: 12413 ft: 13836 corp: 5/41b lim: 25 exec/s: 0 rss: 72Mb L: 7/21 MS: 1 InsertByte- 00:08:10.938 [2024-11-17 08:20:23.817266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.938 [2024-11-17 08:20:23.817295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.938 #14 NEW cov: 12413 ft: 14003 corp: 6/46b lim: 25 exec/s: 0 rss: 72Mb L: 5/21 MS: 3 EraseBytes-InsertByte-InsertByte- 00:08:10.938 [2024-11-17 08:20:23.877779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.938 [2024-11-17 08:20:23.877808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:23.877857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:10.938 [2024-11-17 08:20:23.877873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:23.877926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:10.938 [2024-11-17 08:20:23.877942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:23.877993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:10.938 [2024-11-17 08:20:23.878008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.938 #15 NEW cov: 12413 ft: 14143 corp: 7/67b lim: 25 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 ChangeByte- 00:08:10.938 [2024-11-17 08:20:23.937934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.938 [2024-11-17 08:20:23.937961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:23.938029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:10.938 [2024-11-17 08:20:23.938045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:23.938098] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:10.938 [2024-11-17 08:20:23.938112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:23.938165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:10.938 [2024-11-17 08:20:23.938180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.938 #16 NEW cov: 12413 ft: 14250 corp: 8/88b lim: 25 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 ChangeBinInt- 00:08:10.938 [2024-11-17 08:20:23.977685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.938 [2024-11-17 08:20:23.977717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.938 #19 NEW cov: 12413 ft: 14352 corp: 9/97b lim: 25 exec/s: 0 rss: 72Mb L: 9/21 MS: 3 ShuffleBytes-ShuffleBytes-CrossOver- 00:08:10.938 [2024-11-17 08:20:24.017794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.938 [2024-11-17 08:20:24.017820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.938 #20 NEW cov: 12413 ft: 14444 corp: 10/103b lim: 25 exec/s: 0 rss: 72Mb L: 6/21 MS: 1 ChangeBit- 00:08:10.938 [2024-11-17 08:20:24.058017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:10.938 [2024-11-17 08:20:24.058043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.938 [2024-11-17 08:20:24.058096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:10.938 [2024-11-17 08:20:24.058113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.198 #21 NEW cov: 12413 ft: 14727 corp: 11/113b lim: 25 exec/s: 0 rss: 72Mb L: 10/21 MS: 1 InsertByte- 00:08:11.198 [2024-11-17 08:20:24.118484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.198 [2024-11-17 08:20:24.118510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.198 [2024-11-17 08:20:24.118564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:11.198 [2024-11-17 08:20:24.118579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.198 [2024-11-17 08:20:24.118631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:11.198 [2024-11-17 08:20:24.118646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.198 [2024-11-17 08:20:24.118701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:11.198 [2024-11-17 08:20:24.118717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.198 #22 NEW cov: 12413 ft: 14752 corp: 12/134b lim: 25 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 ChangeBinInt- 00:08:11.198 [2024-11-17 08:20:24.178262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.198 [2024-11-17 08:20:24.178289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.198 #23 NEW cov: 12413 ft: 14768 corp: 13/139b lim: 25 exec/s: 0 rss: 73Mb L: 5/21 MS: 1 ChangeBit- 00:08:11.198 [2024-11-17 08:20:24.238765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.198 [2024-11-17 08:20:24.238792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.198 [2024-11-17 08:20:24.238859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:11.198 [2024-11-17 08:20:24.238875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.198 [2024-11-17 08:20:24.238926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:11.198 [2024-11-17 08:20:24.238941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.198 [2024-11-17 08:20:24.238992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:11.198 [2024-11-17 08:20:24.239008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.198 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:11.198 #24 NEW cov: 12436 ft: 14848 corp: 14/160b lim: 25 exec/s: 0 rss: 73Mb L: 21/21 MS: 1 ShuffleBytes- 00:08:11.198 [2024-11-17 08:20:24.278519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.198 [2024-11-17 08:20:24.278548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.198 #25 NEW cov: 12436 ft: 14892 corp: 15/167b lim: 25 exec/s: 0 rss: 73Mb L: 7/21 MS: 1 CopyPart- 00:08:11.457 [2024-11-17 08:20:24.338710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.458 [2024-11-17 08:20:24.338739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.458 #26 NEW cov: 12436 ft: 14960 corp: 16/173b lim: 25 exec/s: 26 rss: 73Mb L: 6/21 MS: 1 ShuffleBytes- 00:08:11.458 [2024-11-17 08:20:24.378786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.458 [2024-11-17 08:20:24.378813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.458 #27 NEW cov: 12436 ft: 14972 corp: 17/179b lim: 25 exec/s: 27 rss: 73Mb L: 6/21 MS: 1 ChangeBit- 00:08:11.458 [2024-11-17 08:20:24.418923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 
00:08:11.458 [2024-11-17 08:20:24.418950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.458 #28 NEW cov: 12436 ft: 15003 corp: 18/185b lim: 25 exec/s: 28 rss: 73Mb L: 6/21 MS: 1 EraseBytes- 00:08:11.458 [2024-11-17 08:20:24.459021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.458 [2024-11-17 08:20:24.459048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.458 #29 NEW cov: 12436 ft: 15041 corp: 19/193b lim: 25 exec/s: 29 rss: 73Mb L: 8/21 MS: 1 CopyPart- 00:08:11.458 [2024-11-17 08:20:24.519172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.458 [2024-11-17 08:20:24.519199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.458 #30 NEW cov: 12436 ft: 15111 corp: 20/198b lim: 25 exec/s: 30 rss: 73Mb L: 5/21 MS: 1 ShuffleBytes- 00:08:11.458 [2024-11-17 08:20:24.559282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.458 [2024-11-17 08:20:24.559309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.458 #31 NEW cov: 12436 ft: 15119 corp: 21/204b lim: 25 exec/s: 31 rss: 73Mb L: 6/21 MS: 1 CopyPart- 00:08:11.717 [2024-11-17 08:20:24.599424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.717 [2024-11-17 08:20:24.599453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.717 #32 NEW cov: 12436 ft: 15133 corp: 22/209b lim: 25 exec/s: 32 rss: 73Mb L: 5/21 MS: 1 CMP- DE: "\000\000"- 00:08:11.717 [2024-11-17 08:20:24.639810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.717 [2024-11-17 08:20:24.639836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.717 [2024-11-17 08:20:24.639885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:11.717 [2024-11-17 08:20:24.639901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.718 [2024-11-17 08:20:24.639954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:11.718 [2024-11-17 08:20:24.639969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.718 #33 NEW cov: 12436 ft: 15341 corp: 23/226b lim: 25 exec/s: 33 rss: 73Mb L: 17/21 MS: 1 EraseBytes- 00:08:11.718 [2024-11-17 08:20:24.699657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.718 [2024-11-17 08:20:24.699685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.718 #34 NEW cov: 12436 ft: 15356 corp: 24/231b lim: 25 exec/s: 34 rss: 73Mb L: 5/21 MS: 1 ChangeBit- 00:08:11.718 [2024-11-17 
08:20:24.759864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.718 [2024-11-17 08:20:24.759890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.718 #35 NEW cov: 12436 ft: 15367 corp: 25/236b lim: 25 exec/s: 35 rss: 73Mb L: 5/21 MS: 1 ChangeBinInt- 00:08:11.718 [2024-11-17 08:20:24.820020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.718 [2024-11-17 08:20:24.820046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.718 #36 NEW cov: 12436 ft: 15406 corp: 26/242b lim: 25 exec/s: 36 rss: 73Mb L: 6/21 MS: 1 ChangeBit- 00:08:11.977 [2024-11-17 08:20:24.860128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.977 [2024-11-17 08:20:24.860154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.977 #37 NEW cov: 12436 ft: 15447 corp: 27/248b lim: 25 exec/s: 37 rss: 73Mb L: 6/21 MS: 1 ChangeByte- 00:08:11.977 [2024-11-17 08:20:24.920309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.977 [2024-11-17 08:20:24.920337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.977 #38 NEW cov: 12436 ft: 15451 corp: 28/254b lim: 25 exec/s: 38 rss: 73Mb L: 6/21 MS: 1 ChangeBit- 00:08:11.977 [2024-11-17 08:20:24.980867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.977 [2024-11-17 08:20:24.980893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.977 [2024-11-17 08:20:24.980964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:11.977 [2024-11-17 08:20:24.980981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.977 [2024-11-17 08:20:24.981034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:11.977 [2024-11-17 08:20:24.981049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.977 [2024-11-17 08:20:24.981103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:11.977 [2024-11-17 08:20:24.981118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.977 #39 NEW cov: 12436 ft: 15463 corp: 29/276b lim: 25 exec/s: 39 rss: 74Mb L: 22/22 MS: 1 InsertByte- 00:08:11.977 [2024-11-17 08:20:25.040773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.977 [2024-11-17 08:20:25.040799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.977 [2024-11-17 08:20:25.040852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 
00:08:11.977 [2024-11-17 08:20:25.040868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.977 #40 NEW cov: 12436 ft: 15473 corp: 30/286b lim: 25 exec/s: 40 rss: 74Mb L: 10/22 MS: 1 ChangeBinInt- 00:08:11.977 [2024-11-17 08:20:25.100975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:11.977 [2024-11-17 08:20:25.101003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.977 [2024-11-17 08:20:25.101052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:11.977 [2024-11-17 08:20:25.101071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.237 #41 NEW cov: 12436 ft: 15483 corp: 31/297b lim: 25 exec/s: 41 rss: 74Mb L: 11/22 MS: 1 InsertByte- 00:08:12.237 [2024-11-17 08:20:25.140927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.237 [2024-11-17 08:20:25.140954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.237 #42 NEW cov: 12436 ft: 15502 corp: 32/302b lim: 25 exec/s: 42 rss: 74Mb L: 5/22 MS: 1 ChangeByte- 00:08:12.237 [2024-11-17 08:20:25.201266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.237 [2024-11-17 08:20:25.201291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.237 #43 NEW cov: 12436 ft: 15515 corp: 33/309b lim: 25 exec/s: 43 rss: 74Mb L: 7/22 MS: 1 InsertByte- 00:08:12.237 [2024-11-17 08:20:25.241501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.237 [2024-11-17 08:20:25.241527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.237 [2024-11-17 08:20:25.241582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.237 [2024-11-17 08:20:25.241598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.237 #44 NEW cov: 12436 ft: 15518 corp: 34/321b lim: 25 exec/s: 44 rss: 74Mb L: 12/22 MS: 1 EraseBytes- 00:08:12.237 [2024-11-17 08:20:25.301503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.237 [2024-11-17 08:20:25.301528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.237 #46 NEW cov: 12436 ft: 15540 corp: 35/326b lim: 25 exec/s: 23 rss: 74Mb L: 5/22 MS: 2 EraseBytes-PersAutoDict- DE: "\000\000"- 00:08:12.237 #46 DONE cov: 12436 ft: 15540 corp: 35/326b lim: 25 exec/s: 23 rss: 74Mb 00:08:12.237 ###### Recommended dictionary. ###### 00:08:12.237 "\000\000" # Uses: 1 00:08:12.237 ###### End of recommended dictionary. 
###### 00:08:12.237 Done 46 runs in 2 second(s) 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:12.497 08:20:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:12.497 [2024-11-17 08:20:25.494471] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
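Each nvmf fuzzer pass in the xtrace above derives its own NVMe/TCP service ID, JSON config, and corpus directory from the run index (here fuzzer 24 maps to trsvcid 4424). A condensed sketch of that per-run setup, reconstructed only from the commands visible in this trace; variable names follow the trace, but the redirect into $nvmf_cfg and the $rootdir prefix are assumptions, and this is not the full nvmf/run.sh:

    fuzzer_type=24 timen=1 core=0x1
    port="44$(printf %02d "$fuzzer_type")"          # 4424; the "44" prefix is inferred from port=4424 in the trace
    nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
    corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"   # $rootdir is an assumed shorthand for the spdk checkout
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    mkdir -p "$corpus_dir"
    # give this run its own listener by rewriting the template config (output redirect assumed)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # suppress the two known-benign leak sites for LeakSanitizer, as echoed in the trace
    echo "leak:spdk_nvmf_qpair_disconnect" >> "$suppress_file"
    echo "leak:nvmf_ctrlr_create" >> "$suppress_file"
    LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
        ./llvm_nvme_fuzz -m "$core" -s 512 -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
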
00:08:12.497 [2024-11-17 08:20:25.494539] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1000004 ] 00:08:12.756 [2024-11-17 08:20:25.673318] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.757 [2024-11-17 08:20:25.695352] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.757 [2024-11-17 08:20:25.747612] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.757 [2024-11-17 08:20:25.763953] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:12.757 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.757 INFO: Seed: 4037912332 00:08:12.757 INFO: Loaded 1 modules (384274 inline 8-bit counters): 384274 [0x2a7848c, 0x2ad619e), 00:08:12.757 INFO: Loaded 1 PC tables (384274 PCs): 384274 [0x2ad61a0,0x30b32c0), 00:08:12.757 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:12.757 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.757 #2 INITED exec/s: 0 rss: 65Mb 00:08:12.757 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:12.757 This may also happen if the target rejected all inputs we tried so far 00:08:12.757 [2024-11-17 08:20:25.830738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.757 [2024-11-17 08:20:25.830795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.757 [2024-11-17 08:20:25.830920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.757 [2024-11-17 08:20:25.830943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.757 [2024-11-17 08:20:25.831071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.757 [2024-11-17 08:20:25.831096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.757 [2024-11-17 08:20:25.831219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.757 [2024-11-17 08:20:25.831243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.016 NEW_FUNC[1/716]: 0x4851a8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:13.016 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.016 #7 NEW cov: 12281 ft: 12282 corp: 2/97b lim: 100 exec/s: 0 rss: 72Mb L: 96/96 MS: 5 CopyPart-ChangeByte-CMP-CrossOver-InsertRepeatedBytes- DE: "\371lw=\233\226\212\000"- 00:08:13.279 [2024-11-17 08:20:26.181797] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.181852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.181983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.182013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.182145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.182175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.182302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433263848 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.182333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.280 #13 NEW cov: 12394 ft: 12910 corp: 3/193b lim: 100 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 ChangeBit- 00:08:13.280 [2024-11-17 08:20:26.251934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.251970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.252095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.252122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.252243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.252267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.252393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.252417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.280 #14 NEW cov: 12400 ft: 13166 corp: 4/289b lim: 100 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 CopyPart- 00:08:13.280 [2024-11-17 08:20:26.302119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.302154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:13.280 [2024-11-17 08:20:26.302247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.280 [2024-11-17 08:20:26.302273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.280 [2024-11-17 08:20:26.302386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.281 [2024-11-17 08:20:26.302414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.281 [2024-11-17 08:20:26.302534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.281 [2024-11-17 08:20:26.302559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.281 #15 NEW cov: 12485 ft: 13396 corp: 5/385b lim: 100 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 ChangeBinInt- 00:08:13.281 [2024-11-17 08:20:26.352235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.281 [2024-11-17 08:20:26.352266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.281 [2024-11-17 08:20:26.352342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.281 [2024-11-17 08:20:26.352365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.281 [2024-11-17 08:20:26.352476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.281 [2024-11-17 08:20:26.352501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.281 [2024-11-17 08:20:26.352619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782713390247241960 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.281 [2024-11-17 08:20:26.352641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.281 #16 NEW cov: 12485 ft: 13503 corp: 6/482b lim: 100 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 InsertByte- 00:08:13.542 [2024-11-17 08:20:26.422473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.422504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.422587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.422611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.422728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.422748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.422867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.422889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.542 #17 NEW cov: 12485 ft: 13560 corp: 7/579b lim: 100 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 InsertByte- 00:08:13.542 [2024-11-17 08:20:26.492692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.492726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.492826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.492850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.492972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.492994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.493109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.493129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.542 #18 NEW cov: 12485 ft: 13607 corp: 8/675b lim: 100 exec/s: 0 rss: 72Mb L: 96/97 MS: 1 ChangeByte- 00:08:13.542 [2024-11-17 08:20:26.542946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.542976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.543055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.543079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.543192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.543211] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.543327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.543342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.542 #19 NEW cov: 12485 ft: 13658 corp: 9/769b lim: 100 exec/s: 0 rss: 72Mb L: 94/97 MS: 1 EraseBytes- 00:08:13.542 [2024-11-17 08:20:26.592953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.592983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.593057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.593083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.593204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.593231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.593359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.593385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.542 #20 NEW cov: 12485 ft: 13705 corp: 10/866b lim: 100 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 CrossOver- 00:08:13.542 [2024-11-17 08:20:26.643269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.643306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.643417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.643442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.643553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.542 [2024-11-17 08:20:26.643577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.542 [2024-11-17 08:20:26.643698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433263848 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.542 [2024-11-17 08:20:26.643719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.542 #21 NEW cov: 12485 ft: 13769 corp: 11/962b lim: 100 exec/s: 0 rss: 72Mb L: 96/97 MS: 1 ChangeBit- 00:08:13.802 [2024-11-17 08:20:26.693371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.693408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.693499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.693522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.693637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.693662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.693777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.693801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.802 NEW_FUNC[1/1]: 0x1c1ae08 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:13.802 #22 NEW cov: 12508 ft: 13829 corp: 12/1056b lim: 100 exec/s: 0 rss: 72Mb L: 94/97 MS: 1 ShuffleBytes- 00:08:13.802 [2024-11-17 08:20:26.763434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.763465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.763563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.763584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.763701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.763722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.763839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.763865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.802 
#28 NEW cov: 12508 ft: 13868 corp: 13/1152b lim: 100 exec/s: 0 rss: 72Mb L: 96/97 MS: 1 CrossOver- 00:08:13.802 [2024-11-17 08:20:26.813720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.813749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.813870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.813892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.814008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.814032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.814154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.814178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.802 #29 NEW cov: 12508 ft: 13896 corp: 14/1250b lim: 100 exec/s: 29 rss: 72Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:13.802 [2024-11-17 08:20:26.863853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.863889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.863982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.864007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.864122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.864143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.802 [2024-11-17 08:20:26.864253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.802 [2024-11-17 08:20:26.864276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.803 #30 NEW cov: 12508 ft: 13916 corp: 15/1334b lim: 100 exec/s: 30 rss: 73Mb L: 84/98 MS: 1 EraseBytes- 00:08:13.803 [2024-11-17 08:20:26.934144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.803 [2024-11-17 08:20:26.934179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.803 [2024-11-17 08:20:26.934277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.803 [2024-11-17 08:20:26.934300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.803 [2024-11-17 08:20:26.934417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.803 [2024-11-17 08:20:26.934441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.803 [2024-11-17 08:20:26.934568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:11211300059174696765 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.803 [2024-11-17 08:20:26.934593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.062 #31 NEW cov: 12508 ft: 13920 corp: 16/1430b lim: 100 exec/s: 31 rss: 73Mb L: 96/98 MS: 1 PersAutoDict- DE: "\371lw=\233\226\212\000"- 00:08:14.062 [2024-11-17 08:20:26.984303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:26.984338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.062 [2024-11-17 08:20:26.984447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:26.984469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.062 [2024-11-17 08:20:26.984580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:26.984607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.062 [2024-11-17 08:20:26.984726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:26.984750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.062 #32 NEW cov: 12508 ft: 14026 corp: 17/1529b lim: 100 exec/s: 32 rss: 73Mb L: 99/99 MS: 1 InsertByte- 00:08:14.062 [2024-11-17 08:20:27.054441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:27.054472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.062 [2024-11-17 08:20:27.054570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:27.054594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.062 [2024-11-17 08:20:27.054720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:27.054742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.062 [2024-11-17 08:20:27.054873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.062 [2024-11-17 08:20:27.054898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.062 #33 NEW cov: 12508 ft: 14037 corp: 18/1623b lim: 100 exec/s: 33 rss: 73Mb L: 94/99 MS: 1 ChangeBinInt- 00:08:14.063 [2024-11-17 08:20:27.104599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.104629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.063 [2024-11-17 08:20:27.104728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.104752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.063 [2024-11-17 08:20:27.104874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.104897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.063 [2024-11-17 08:20:27.105023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:11211300059174696765 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.105046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.063 #34 NEW cov: 12508 ft: 14053 corp: 19/1719b lim: 100 exec/s: 34 rss: 73Mb L: 96/99 MS: 1 ShuffleBytes- 00:08:14.063 [2024-11-17 08:20:27.174882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.174913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.063 [2024-11-17 08:20:27.174976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.174999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.063 [2024-11-17 08:20:27.175119] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920097051836648 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.175147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.063 [2024-11-17 08:20:27.175273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.063 [2024-11-17 08:20:27.175297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.063 #35 NEW cov: 12508 ft: 14070 corp: 20/1815b lim: 100 exec/s: 35 rss: 73Mb L: 96/99 MS: 1 PersAutoDict- DE: "\371lw=\233\226\212\000"- 00:08:14.322 [2024-11-17 08:20:27.225055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.225085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.322 [2024-11-17 08:20:27.225166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.225188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.322 [2024-11-17 08:20:27.225302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.225325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.322 [2024-11-17 08:20:27.225449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782938257555515624 len:39831 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.225468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.322 #36 NEW cov: 12508 ft: 14109 corp: 21/1912b lim: 100 exec/s: 36 rss: 73Mb L: 97/99 MS: 1 PersAutoDict- DE: "\371lw=\233\226\212\000"- 00:08:14.322 [2024-11-17 08:20:27.295345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.295383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.322 [2024-11-17 08:20:27.295496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.295517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.322 [2024-11-17 08:20:27.295632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.322 [2024-11-17 08:20:27.295654] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.295782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:11196077320688138045 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.295806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.323 #37 NEW cov: 12508 ft: 14120 corp: 22/2008b lim: 100 exec/s: 37 rss: 73Mb L: 96/99 MS: 1 ChangeBinInt- 00:08:14.323 [2024-11-17 08:20:27.365500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.365532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.365599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.365623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.365762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.365787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.365906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.365930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.323 #38 NEW cov: 12508 ft: 14187 corp: 23/2104b lim: 100 exec/s: 38 rss: 73Mb L: 96/99 MS: 1 ShuffleBytes- 00:08:14.323 [2024-11-17 08:20:27.415704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.415736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.415815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.415849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.415966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 08:20:27.415990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.323 [2024-11-17 08:20:27.416104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.323 [2024-11-17 
08:20:27.416131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.323 #39 NEW cov: 12508 ft: 14200 corp: 24/2203b lim: 100 exec/s: 39 rss: 73Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:14.583 [2024-11-17 08:20:27.465852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.465888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.465979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782884914061699304 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.466003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.466116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.466139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.466253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.466279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.583 #40 NEW cov: 12508 ft: 14237 corp: 25/2297b lim: 100 exec/s: 40 rss: 73Mb L: 94/99 MS: 1 ChangeBit- 00:08:14.583 [2024-11-17 08:20:27.536013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.536044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.536124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.536144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.536260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.536285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.536395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:11211300059174696765 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.536418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.583 #41 NEW cov: 12508 ft: 14245 corp: 26/2393b lim: 100 exec/s: 41 rss: 73Mb L: 96/99 MS: 1 ChangeByte- 00:08:14.583 [2024-11-17 08:20:27.586347] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.586386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.586501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.586529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.586646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16735350827369687272 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.586674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.586794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098299570408 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.586820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.586936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:16773649583324063976 len:2716 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.586959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.583 #42 NEW cov: 12508 ft: 14308 corp: 27/2493b lim: 100 exec/s: 42 rss: 73Mb L: 100/100 MS: 1 InsertByte- 00:08:14.583 [2024-11-17 08:20:27.656338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.656371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.656470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.656494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.656605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.656625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.656743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.656768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.583 #43 NEW cov: 12508 ft: 14383 corp: 28/2587b lim: 100 exec/s: 43 rss: 73Mb L: 94/100 
MS: 1 ChangeByte- 00:08:14.583 [2024-11-17 08:20:27.706460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.706493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.706577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:416224438504 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.706599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.706711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.706732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.583 [2024-11-17 08:20:27.706846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.583 [2024-11-17 08:20:27.706868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.843 #44 NEW cov: 12508 ft: 14398 corp: 29/2683b lim: 100 exec/s: 44 rss: 73Mb L: 96/100 MS: 1 ChangeBinInt- 00:08:14.843 [2024-11-17 08:20:27.756582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.756614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.843 [2024-11-17 08:20:27.756699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920197218035944 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.756723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.843 [2024-11-17 08:20:27.756840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.756866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.843 [2024-11-17 08:20:27.756991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.757018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.843 #45 NEW cov: 12508 ft: 14407 corp: 30/2779b lim: 100 exec/s: 45 rss: 73Mb L: 96/100 MS: 1 CMP- DE: "\377\377"- 00:08:14.843 [2024-11-17 08:20:27.826849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.826888] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.843 [2024-11-17 08:20:27.826985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.827009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.843 [2024-11-17 08:20:27.827126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.827148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.843 [2024-11-17 08:20:27.827263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.843 [2024-11-17 08:20:27.827288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.843 #46 NEW cov: 12508 ft: 14413 corp: 31/2873b lim: 100 exec/s: 23 rss: 73Mb L: 94/100 MS: 1 ChangeBit- 00:08:14.843 #46 DONE cov: 12508 ft: 14413 corp: 31/2873b lim: 100 exec/s: 23 rss: 73Mb 00:08:14.843 ###### Recommended dictionary. ###### 00:08:14.843 "\371lw=\233\226\212\000" # Uses: 3 00:08:14.843 "\377\377" # Uses: 0 00:08:14.843 ###### End of recommended dictionary. ###### 00:08:14.843 Done 46 runs in 2 second(s) 00:08:14.843 08:20:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:14.843 08:20:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:14.843 08:20:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.843 08:20:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:14.843 00:08:14.843 real 1m2.469s 00:08:14.843 user 1m39.178s 00:08:14.843 sys 0m6.988s 00:08:14.843 08:20:27 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:14.843 08:20:27 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:14.843 ************************************ 00:08:14.843 END TEST nvmf_llvm_fuzz 00:08:14.843 ************************************ 00:08:15.104 08:20:28 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:15.104 08:20:28 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:15.104 08:20:28 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:15.104 08:20:28 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:15.104 08:20:28 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:15.104 08:20:28 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:15.104 ************************************ 00:08:15.104 START TEST vfio_llvm_fuzz 00:08:15.104 ************************************ 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:15.104 * Looking for test storage... 
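The vfio fuzz wrapper that starts here first sources test/setup/common.sh and autotest_common.sh, and the xtrace that follows probes the installed lcov version (lcov --version | awk '{print $NF}' yields 1.15, then lt 1.15 2 runs cmp_versions) to decide which --rc options and --gcov-tool go into LCOV_OPTS. A rough, hypothetical reconstruction of that field-by-field comparison, based only on the steps visible in the trace; the real scripts/common.sh also has a decimal() validator and more operators than shown here:

    lt() { cmp_versions "$1" '<' "$2"; }

    cmp_versions() {
        local op=$2 v ver1 ver2
        IFS='.-:' read -ra ver1 <<< "$1"    # "1.15" -> (1 15)
        IFS='.-:' read -ra ver2 <<< "$3"    # "2"    -> (2)
        for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && { [[ $op == '>' ]]; return; }
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && { [[ $op == '<' ]]; return; }
        done
        [[ $op == '==' ]]    # equal in every compared field
    }

    # 1.15 < 2 here, so the trace goes on to export the lcov 1.x style LCOV_OPTS with the llvm-gcov.sh gcov tool
    lt "$(lcov --version | awk '{print $NF}')" 2 && echo "older lcov detected"
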
00:08:15.104 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:15.104 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.104 --rc genhtml_branch_coverage=1 00:08:15.104 --rc genhtml_function_coverage=1 00:08:15.104 --rc genhtml_legend=1 00:08:15.104 --rc geninfo_all_blocks=1 00:08:15.104 --rc geninfo_unexecuted_blocks=1 00:08:15.104 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.104 ' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:15.104 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.104 --rc genhtml_branch_coverage=1 00:08:15.104 --rc genhtml_function_coverage=1 00:08:15.104 --rc genhtml_legend=1 00:08:15.104 --rc geninfo_all_blocks=1 00:08:15.104 --rc geninfo_unexecuted_blocks=1 00:08:15.104 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.104 ' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:15.104 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.104 --rc genhtml_branch_coverage=1 00:08:15.104 --rc genhtml_function_coverage=1 00:08:15.104 --rc genhtml_legend=1 00:08:15.104 --rc geninfo_all_blocks=1 00:08:15.104 --rc geninfo_unexecuted_blocks=1 00:08:15.104 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.104 ' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:15.104 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.104 --rc genhtml_branch_coverage=1 00:08:15.104 --rc genhtml_function_coverage=1 00:08:15.104 --rc genhtml_legend=1 00:08:15.104 --rc geninfo_all_blocks=1 00:08:15.104 --rc geninfo_unexecuted_blocks=1 00:08:15.104 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.104 ' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:15.104 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:15.105 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:08:15.367 08:20:28 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:08:15.367 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:15.368 #define SPDK_CONFIG_H 00:08:15.368 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:15.368 #define SPDK_CONFIG_APPS 1 00:08:15.368 #define SPDK_CONFIG_ARCH native 00:08:15.368 #undef SPDK_CONFIG_ASAN 00:08:15.368 #undef SPDK_CONFIG_AVAHI 00:08:15.368 #undef SPDK_CONFIG_CET 00:08:15.368 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:15.368 #define SPDK_CONFIG_COVERAGE 1 00:08:15.368 #define SPDK_CONFIG_CROSS_PREFIX 00:08:15.368 #undef SPDK_CONFIG_CRYPTO 00:08:15.368 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:15.368 #undef SPDK_CONFIG_CUSTOMOCF 00:08:15.368 #undef SPDK_CONFIG_DAOS 00:08:15.368 #define SPDK_CONFIG_DAOS_DIR 00:08:15.368 #define SPDK_CONFIG_DEBUG 1 00:08:15.368 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:15.368 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:15.368 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:15.368 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:15.368 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:15.368 #undef SPDK_CONFIG_DPDK_UADK 00:08:15.368 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:15.368 #define SPDK_CONFIG_EXAMPLES 1 00:08:15.368 #undef SPDK_CONFIG_FC 00:08:15.368 #define SPDK_CONFIG_FC_PATH 00:08:15.368 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:15.368 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:15.368 #define SPDK_CONFIG_FSDEV 1 00:08:15.368 #undef SPDK_CONFIG_FUSE 00:08:15.368 #define SPDK_CONFIG_FUZZER 1 00:08:15.368 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:15.368 #undef SPDK_CONFIG_GOLANG 00:08:15.368 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:15.368 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:15.368 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:15.368 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:15.368 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:15.368 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:15.368 #undef SPDK_CONFIG_HAVE_LZ4 00:08:15.368 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:15.368 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:15.368 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:15.368 #define SPDK_CONFIG_IDXD 1 00:08:15.368 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:15.368 #undef SPDK_CONFIG_IPSEC_MB 00:08:15.368 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:15.368 #define SPDK_CONFIG_ISAL 1 00:08:15.368 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:15.368 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:15.368 #define SPDK_CONFIG_LIBDIR 00:08:15.368 #undef SPDK_CONFIG_LTO 00:08:15.368 #define SPDK_CONFIG_MAX_LCORES 128 00:08:15.368 #define SPDK_CONFIG_NVME_CUSE 1 00:08:15.368 #undef SPDK_CONFIG_OCF 00:08:15.368 #define SPDK_CONFIG_OCF_PATH 00:08:15.368 #define SPDK_CONFIG_OPENSSL_PATH 00:08:15.368 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:15.368 #define SPDK_CONFIG_PGO_DIR 00:08:15.368 #undef SPDK_CONFIG_PGO_USE 00:08:15.368 #define SPDK_CONFIG_PREFIX /usr/local 00:08:15.368 #undef SPDK_CONFIG_RAID5F 00:08:15.368 #undef SPDK_CONFIG_RBD 00:08:15.368 #define SPDK_CONFIG_RDMA 1 00:08:15.368 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:15.368 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:15.368 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:15.368 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:15.368 #undef SPDK_CONFIG_SHARED 00:08:15.368 #undef SPDK_CONFIG_SMA 00:08:15.368 #define SPDK_CONFIG_TESTS 1 00:08:15.368 #undef SPDK_CONFIG_TSAN 00:08:15.368 #define SPDK_CONFIG_UBLK 1 00:08:15.368 #define SPDK_CONFIG_UBSAN 1 00:08:15.368 #undef SPDK_CONFIG_UNIT_TESTS 00:08:15.368 #undef SPDK_CONFIG_URING 00:08:15.368 #define SPDK_CONFIG_URING_PATH 00:08:15.368 #undef SPDK_CONFIG_URING_ZNS 00:08:15.368 #undef SPDK_CONFIG_USDT 00:08:15.368 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:15.368 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:15.368 #define SPDK_CONFIG_VFIO_USER 1 00:08:15.368 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:15.368 #define SPDK_CONFIG_VHOST 1 00:08:15.368 #define SPDK_CONFIG_VIRTIO 1 00:08:15.368 #undef SPDK_CONFIG_VTUNE 00:08:15.368 #define SPDK_CONFIG_VTUNE_DIR 00:08:15.368 #define SPDK_CONFIG_WERROR 1 00:08:15.368 #define SPDK_CONFIG_WPDK_DIR 00:08:15.368 #undef SPDK_CONFIG_XNVME 00:08:15.368 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:15.368 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:15.369 08:20:28 
llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:15.369 08:20:28 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:15.369 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:15.370 08:20:28 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:15.370 08:20:28 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:08:15.370 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 1000569 ]] 00:08:15.371 
08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 1000569 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.9XNEPC 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.9XNEPC/tests/vfio /tmp/spdk.9XNEPC 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=52386893824 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730594816 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=9343700992 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30861869056 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865297408 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=3428352 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340121600 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=6000640 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30865092608 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865297408 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=204800 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:08:15.371 * Looking for test storage... 
00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=52386893824 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=11558293504 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:15.371 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:15.371 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:15.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.372 --rc genhtml_branch_coverage=1 00:08:15.372 --rc genhtml_function_coverage=1 00:08:15.372 --rc genhtml_legend=1 00:08:15.372 --rc geninfo_all_blocks=1 00:08:15.372 --rc geninfo_unexecuted_blocks=1 00:08:15.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.372 ' 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:15.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.372 --rc genhtml_branch_coverage=1 00:08:15.372 --rc genhtml_function_coverage=1 00:08:15.372 --rc genhtml_legend=1 00:08:15.372 --rc geninfo_all_blocks=1 00:08:15.372 --rc geninfo_unexecuted_blocks=1 00:08:15.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.372 ' 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:15.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.372 --rc genhtml_branch_coverage=1 00:08:15.372 --rc genhtml_function_coverage=1 00:08:15.372 --rc genhtml_legend=1 00:08:15.372 --rc geninfo_all_blocks=1 00:08:15.372 --rc geninfo_unexecuted_blocks=1 00:08:15.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.372 ' 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:15.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:15.372 --rc genhtml_branch_coverage=1 00:08:15.372 --rc genhtml_function_coverage=1 00:08:15.372 --rc genhtml_legend=1 00:08:15.372 --rc geninfo_all_blocks=1 00:08:15.372 --rc geninfo_unexecuted_blocks=1 00:08:15.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:15.372 ' 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:15.372 08:20:28 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:15.372 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:15.632 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:15.632 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:15.632 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:15.632 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:15.632 08:20:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:15.632 [2024-11-17 08:20:28.535806] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:15.632 [2024-11-17 08:20:28.535877] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1000632 ] 00:08:15.632 [2024-11-17 08:20:28.605314] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.632 [2024-11-17 08:20:28.643442] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.892 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.892 INFO: Seed: 2782957162 00:08:15.892 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:15.892 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:15.892 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:15.892 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.892 #2 INITED exec/s: 0 rss: 66Mb 00:08:15.892 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.892 This may also happen if the target rejected all inputs we tried so far 00:08:15.892 [2024-11-17 08:20:28.870295] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:16.411 NEW_FUNC[1/667]: 0x459068 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:16.411 NEW_FUNC[2/667]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:16.411 #15 NEW cov: 11069 ft: 11054 corp: 2/7b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:16.411 NEW_FUNC[1/1]: 0x13167d8 in nvmf_ctrlr_ns_is_visible /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/./nvmf_internal.h:494 00:08:16.411 #16 NEW cov: 11097 ft: 14280 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:16.669 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:16.669 #17 NEW cov: 11117 ft: 15672 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeByte- 00:08:16.927 #18 NEW cov: 11117 ft: 16075 corp: 5/25b lim: 6 exec/s: 18 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:17.186 #19 NEW cov: 11117 ft: 17017 corp: 6/31b lim: 6 exec/s: 19 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:17.186 #20 NEW cov: 11117 ft: 17228 corp: 7/37b lim: 6 exec/s: 20 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:17.445 #21 NEW cov: 11117 ft: 17876 corp: 8/43b lim: 6 exec/s: 21 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:17.704 #22 NEW cov: 11124 ft: 18242 corp: 9/49b lim: 6 exec/s: 22 rss: 74Mb L: 6/6 MS: 1 CrossOver- 00:08:17.963 #23 NEW cov: 11124 ft: 18429 corp: 10/55b lim: 6 exec/s: 11 rss: 74Mb L: 6/6 MS: 1 ChangeByte- 00:08:17.963 #23 DONE cov: 11124 ft: 18429 corp: 10/55b lim: 6 exec/s: 11 rss: 74Mb 00:08:17.963 Done 23 runs in 2 second(s) 00:08:17.963 [2024-11-17 08:20:30.898884] vfio_user.c:2798:disable_ctrlr: *NOTICE*: 
/tmp/vfio-user-0/domain/2: disabling controller 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:18.223 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:18.223 08:20:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:18.223 [2024-11-17 08:20:31.185557] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:18.223 [2024-11-17 08:20:31.185629] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001169 ] 00:08:18.223 [2024-11-17 08:20:31.256514] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.223 [2024-11-17 08:20:31.293621] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.482 INFO: Running with entropic power schedule (0xFF, 100). 
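Each fuzzer iteration repeats the preparation traced in run.sh@22-47 above: create a per-run /tmp/vfio-user-N tree plus a corpus directory, rewrite the template fuzz_vfio_json.conf so its vfio-user paths point at this run's domain/1 and domain/2, register two LeakSanitizer suppressions, and launch llvm_vfio_fuzz on core mask 0x1 with -Z selecting the target (the earlier lcov probe only assembles LCOV_OPTS around the llvm-gcov wrapper for coverage collection). A rough sketch of one iteration follows; $SPDK_DIR stands in for the workspace path, and the redirections for the sed output and the leak:* lines are not visible in the trace, so they are assumptions here:

  # Approximate reconstruction of one start_llvm_fuzz iteration (N=1 shown).
  N=1
  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # stand-in path
  fuzzer_dir=/tmp/vfio-user-$N
  suppress_file=/var/tmp/suppress_vfio_fuzz
  export LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
  mkdir -p "$fuzzer_dir"/domain/1 "$fuzzer_dir"/domain/2 "$SPDK_DIR/../corpus/llvm_vfio_$N"
  # Point the template JSON config at this run's vfio-user directories
  # (destination file assumed; the trace does not show the redirection).
  sed -e "s%/tmp/vfio-user/domain/1%$fuzzer_dir/domain/1%; s%/tmp/vfio-user/domain/2%$fuzzer_dir/domain/2%" \
      "$SPDK_DIR/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$fuzzer_dir/fuzz_vfio_json.conf"
  echo leak:spdk_nvmf_qpair_disconnect >> "$suppress_file"   # assumed redirection
  echo leak:nvmf_ctrlr_create          >> "$suppress_file"   # assumed redirection
  "$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" -m 0x1 -s 0 -t 1 -Z "$N" \
      -P "$SPDK_DIR/../output/llvm/" \
      -F "$fuzzer_dir/domain/1" -Y "$fuzzer_dir/domain/2" \
      -c "$fuzzer_dir/fuzz_vfio_json.conf" \
      -D "$SPDK_DIR/../corpus/llvm_vfio_$N" \
      -r "$fuzzer_dir/spdk$N.sock"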
00:08:18.482 INFO: Seed: 1143973589 00:08:18.482 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:18.482 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:18.482 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:18.482 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.482 #2 INITED exec/s: 0 rss: 66Mb 00:08:18.482 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.483 This may also happen if the target rejected all inputs we tried so far 00:08:18.483 [2024-11-17 08:20:31.526098] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:18.483 [2024-11-17 08:20:31.581733] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:18.483 [2024-11-17 08:20:31.581758] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:18.483 [2024-11-17 08:20:31.581777] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:19.001 NEW_FUNC[1/670]: 0x459608 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:19.001 NEW_FUNC[2/670]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:19.001 #31 NEW cov: 11070 ft: 10958 corp: 2/5b lim: 4 exec/s: 0 rss: 71Mb L: 4/4 MS: 4 CopyPart-CrossOver-InsertByte-InsertByte- 00:08:19.001 [2024-11-17 08:20:32.073234] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:19.001 [2024-11-17 08:20:32.073268] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:19.001 [2024-11-17 08:20:32.073287] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:19.260 #48 NEW cov: 11093 ft: 13647 corp: 3/9b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 2 CopyPart-CopyPart- 00:08:19.260 [2024-11-17 08:20:32.277654] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:19.260 [2024-11-17 08:20:32.277677] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:19.260 [2024-11-17 08:20:32.277699] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:19.260 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:19.260 #49 NEW cov: 11113 ft: 15045 corp: 4/13b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:19.519 [2024-11-17 08:20:32.479551] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:19.519 [2024-11-17 08:20:32.479575] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:19.519 [2024-11-17 08:20:32.479592] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:19.519 #50 NEW cov: 11113 ft: 15393 corp: 5/17b lim: 4 exec/s: 50 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:08:19.778 [2024-11-17 08:20:32.671706] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:19.778 [2024-11-17 08:20:32.671732] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:19.778 [2024-11-17 08:20:32.671749] vfio_user.c: 144:vfio_user_read: *ERROR*: 
Command 1 return failure 00:08:19.778 #52 NEW cov: 11113 ft: 16711 corp: 6/21b lim: 4 exec/s: 52 rss: 74Mb L: 4/4 MS: 2 EraseBytes-InsertByte- 00:08:19.778 [2024-11-17 08:20:32.864810] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:19.778 [2024-11-17 08:20:32.864831] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:19.778 [2024-11-17 08:20:32.864848] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:20.036 #53 NEW cov: 11113 ft: 17123 corp: 7/25b lim: 4 exec/s: 53 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:08:20.036 [2024-11-17 08:20:33.058892] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:20.037 [2024-11-17 08:20:33.058915] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:20.037 [2024-11-17 08:20:33.058931] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:20.037 #55 NEW cov: 11113 ft: 17452 corp: 8/29b lim: 4 exec/s: 55 rss: 74Mb L: 4/4 MS: 2 EraseBytes-CopyPart- 00:08:20.295 [2024-11-17 08:20:33.252013] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:20.295 [2024-11-17 08:20:33.252036] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:20.295 [2024-11-17 08:20:33.252054] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:20.295 #56 NEW cov: 11120 ft: 17506 corp: 9/33b lim: 4 exec/s: 56 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:08:20.555 [2024-11-17 08:20:33.443991] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:20.555 [2024-11-17 08:20:33.444012] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:20.555 [2024-11-17 08:20:33.444029] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:20.555 #57 NEW cov: 11120 ft: 17535 corp: 10/37b lim: 4 exec/s: 28 rss: 74Mb L: 4/4 MS: 1 CMP- DE: "\377\000"- 00:08:20.555 #57 DONE cov: 11120 ft: 17535 corp: 10/37b lim: 4 exec/s: 28 rss: 74Mb 00:08:20.555 ###### Recommended dictionary. ###### 00:08:20.555 "\377\000" # Uses: 0 00:08:20.555 ###### End of recommended dictionary. 
###### 00:08:20.555 Done 57 runs in 2 second(s) 00:08:20.555 [2024-11-17 08:20:33.581877] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:20.818 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:20.819 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:20.819 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.819 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:20.819 08:20:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:20.819 [2024-11-17 08:20:33.866923] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:20.819 [2024-11-17 08:20:33.866996] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1001625 ] 00:08:20.819 [2024-11-17 08:20:33.938525] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.180 [2024-11-17 08:20:33.978548] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.180 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.180 INFO: Seed: 3826985071 00:08:21.180 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:21.180 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:21.180 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:21.180 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.180 #2 INITED exec/s: 0 rss: 66Mb 00:08:21.180 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.180 This may also happen if the target rejected all inputs we tried so far 00:08:21.180 [2024-11-17 08:20:34.217106] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:21.180 [2024-11-17 08:20:34.274216] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:21.699 NEW_FUNC[1/669]: 0x459ff8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:21.699 NEW_FUNC[2/669]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:21.699 #24 NEW cov: 11062 ft: 10807 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:21.699 [2024-11-17 08:20:34.755515] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:21.959 #25 NEW cov: 11079 ft: 14253 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:21.959 [2024-11-17 08:20:34.954685] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:21.959 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:21.959 #26 NEW cov: 11096 ft: 15951 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:08:22.218 [2024-11-17 08:20:35.156498] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:22.218 #30 NEW cov: 11096 ft: 16741 corp: 5/33b lim: 8 exec/s: 30 rss: 74Mb L: 8/8 MS: 4 ChangeByte-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:22.477 [2024-11-17 08:20:35.357151] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:22.477 #31 NEW cov: 11096 ft: 17188 corp: 6/41b lim: 8 exec/s: 31 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:08:22.477 [2024-11-17 08:20:35.551133] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:22.737 #32 NEW cov: 11096 ft: 17571 corp: 7/49b lim: 8 exec/s: 32 rss: 74Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:22.737 [2024-11-17 08:20:35.741517] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:22.737 #33 NEW cov: 11096 ft: 17697 corp: 8/57b lim: 8 exec/s: 33 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 
00:08:22.996 [2024-11-17 08:20:35.931372] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:22.996 #34 NEW cov: 11103 ft: 17799 corp: 9/65b lim: 8 exec/s: 34 rss: 74Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:22.996 [2024-11-17 08:20:36.122652] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:23.255 #35 NEW cov: 11103 ft: 17970 corp: 10/73b lim: 8 exec/s: 17 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:08:23.255 #35 DONE cov: 11103 ft: 17970 corp: 10/73b lim: 8 exec/s: 17 rss: 74Mb 00:08:23.255 Done 35 runs in 2 second(s) 00:08:23.255 [2024-11-17 08:20:36.259869] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:23.515 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:23.516 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:23.516 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.516 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:23.516 08:20:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:23.516 [2024-11-17 08:20:36.539437] Starting SPDK v24.09.1-pre git 
sha1 b18e1bd62 / DPDK 23.11.0 initialization... 00:08:23.516 [2024-11-17 08:20:36.539509] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1002002 ] 00:08:23.516 [2024-11-17 08:20:36.613105] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.516 [2024-11-17 08:20:36.650670] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.776 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.776 INFO: Seed: 2207016687 00:08:23.776 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:23.776 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:23.776 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:23.776 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.776 #2 INITED exec/s: 0 rss: 66Mb 00:08:23.776 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.776 This may also happen if the target rejected all inputs we tried so far 00:08:23.776 [2024-11-17 08:20:36.899477] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:24.295 NEW_FUNC[1/669]: 0x45a6e8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:24.295 NEW_FUNC[2/669]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:24.295 #102 NEW cov: 11068 ft: 11023 corp: 2/33b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 5 ChangeBit-ChangeBinInt-InsertRepeatedBytes-ChangeBit-InsertRepeatedBytes- 00:08:24.554 #103 NEW cov: 11083 ft: 13718 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:08:24.814 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:24.814 #104 NEW cov: 11100 ft: 15006 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:24.814 #105 NEW cov: 11100 ft: 15498 corp: 5/129b lim: 32 exec/s: 105 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:25.073 #106 NEW cov: 11100 ft: 16044 corp: 6/161b lim: 32 exec/s: 106 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:25.332 #110 NEW cov: 11100 ft: 16447 corp: 7/193b lim: 32 exec/s: 110 rss: 74Mb L: 32/32 MS: 4 CrossOver-ShuffleBytes-InsertByte-CopyPart- 00:08:25.332 #116 NEW cov: 11100 ft: 16478 corp: 8/225b lim: 32 exec/s: 116 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:25.591 #117 NEW cov: 11100 ft: 16492 corp: 9/257b lim: 32 exec/s: 117 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:25.850 #118 NEW cov: 11107 ft: 16641 corp: 10/289b lim: 32 exec/s: 118 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:26.110 #119 NEW cov: 11107 ft: 16700 corp: 11/321b lim: 32 exec/s: 59 rss: 74Mb L: 32/32 MS: 1 CMP- DE: "c\000\000\000"- 00:08:26.110 #119 DONE cov: 11107 ft: 16700 corp: 11/321b lim: 32 exec/s: 59 rss: 74Mb 00:08:26.110 ###### Recommended dictionary. ###### 00:08:26.110 "c\000\000\000" # Uses: 0 00:08:26.110 ###### End of recommended dictionary. 
###### 00:08:26.110 Done 119 runs in 2 second(s) 00:08:26.110 [2024-11-17 08:20:39.021884] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:26.369 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:26.369 08:20:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:26.369 [2024-11-17 08:20:39.313609] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
00:08:26.369 [2024-11-17 08:20:39.313682] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1002532 ] 00:08:26.369 [2024-11-17 08:20:39.385657] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.369 [2024-11-17 08:20:39.423854] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.628 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.628 INFO: Seed: 684041119 00:08:26.628 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:26.628 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:26.628 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:26.628 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.628 #2 INITED exec/s: 0 rss: 66Mb 00:08:26.628 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.628 This may also happen if the target rejected all inputs we tried so far 00:08:26.628 [2024-11-17 08:20:39.663099] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:27.145 NEW_FUNC[1/669]: 0x45af68 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:27.146 NEW_FUNC[2/669]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:27.146 #120 NEW cov: 11071 ft: 10977 corp: 2/33b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 3 ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:27.146 #126 NEW cov: 11085 ft: 14371 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:08:27.404 #128 NEW cov: 11085 ft: 15313 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 2 EraseBytes-CrossOver- 00:08:27.404 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:27.404 #134 NEW cov: 11102 ft: 15506 corp: 5/129b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:27.663 #140 NEW cov: 11102 ft: 15588 corp: 6/161b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:27.663 #146 NEW cov: 11102 ft: 16366 corp: 7/193b lim: 32 exec/s: 146 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:27.922 #152 NEW cov: 11102 ft: 16906 corp: 8/225b lim: 32 exec/s: 152 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:27.922 #153 NEW cov: 11102 ft: 17034 corp: 9/257b lim: 32 exec/s: 153 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:27.922 #154 NEW cov: 11102 ft: 17275 corp: 10/289b lim: 32 exec/s: 154 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:28.181 #155 NEW cov: 11102 ft: 17327 corp: 11/321b lim: 32 exec/s: 155 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:28.181 #156 NEW cov: 11102 ft: 17393 corp: 12/353b lim: 32 exec/s: 156 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:28.439 #157 NEW cov: 11102 ft: 17691 corp: 13/385b lim: 32 exec/s: 157 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:08:28.439 #158 NEW cov: 11109 ft: 18340 corp: 14/417b lim: 32 exec/s: 158 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:28.699 #159 NEW cov: 11109 ft: 18371 corp: 15/449b lim: 32 exec/s: 79 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:08:28.699 #159 DONE cov: 11109 ft: 18371 corp: 15/449b lim: 32 exec/s: 79 rss: 75Mb 00:08:28.699 Done 159 runs in 2 second(s) 
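The progress lines in these runs use libFuzzer's status format, which the SPDK LLVM fuzz wrappers reuse: cov: is the number of covered code points, ft: the number of distinct features, corp: X/Yb the corpus size as inputs/total bytes, lim: the current input-length limit, L: a/b the length of the new input versus that limit, and MS: the mutation sequence that produced it; the #N DONE line repeats the final totals for the run. If a saved console log needs summarising, something like the following pulls the end-of-run coverage and corpus figures (build.log is an illustrative filename):

  # Print 'cov:' and 'corp:' from every '#N DONE' summary line in a saved log.
  grep -E '#[0-9]+ DONE ' build.log | \
      awk '{ for (i = 1; i <= NF; i++)
               if ($i == "cov:" || $i == "corp:") printf "%s %s  ", $i, $(i+1);
             print "" }'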
00:08:28.699 [2024-11-17 08:20:41.649879] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:28.958 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:28.958 08:20:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:28.958 [2024-11-17 08:20:41.933639] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
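The -Z argument selects which vfio-user request handler the wrapper fuzzes; the NEW_FUNC banners in the runs above and below give the mapping for all seven targets: 0 fuzz_vfio_user_region_rw, 1 fuzz_vfio_user_version, 2 fuzz_vfio_user_get_region_info, 3 fuzz_vfio_user_dma_map, 4 fuzz_vfio_user_dma_unmap, 5 fuzz_vfio_user_irq_set and 6 fuzz_vfio_user_set_msix. run.sh@68 derives fuzz_num=7 by counting the '.fn =' registrations in the fuzzer source; listing them directly shows the same table:

  # Enumerate the registered fuzz targets that run.sh@68 counts with grep -c.
  grep -n '\.fn =' \
      /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c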
00:08:28.958 [2024-11-17 08:20:41.933719] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1003066 ] 00:08:28.958 [2024-11-17 08:20:42.004448] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.958 [2024-11-17 08:20:42.042112] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.217 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.217 INFO: Seed: 3299049741 00:08:29.217 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:29.217 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:29.217 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:29.217 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.217 #2 INITED exec/s: 0 rss: 66Mb 00:08:29.217 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:29.217 This may also happen if the target rejected all inputs we tried so far 00:08:29.217 [2024-11-17 08:20:42.271166] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:29.217 [2024-11-17 08:20:42.318765] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.217 [2024-11-17 08:20:42.318800] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.734 NEW_FUNC[1/670]: 0x45b968 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:29.735 NEW_FUNC[2/670]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:29.735 #162 NEW cov: 11081 ft: 11042 corp: 2/14b lim: 13 exec/s: 0 rss: 71Mb L: 13/13 MS: 5 CopyPart-CopyPart-CopyPart-CrossOver-CMP- DE: "\001\212\226\244\310I\216\002"- 00:08:29.735 [2024-11-17 08:20:42.805734] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.735 [2024-11-17 08:20:42.805777] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.993 #163 NEW cov: 11098 ft: 14141 corp: 3/27b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 ChangeByte- 00:08:29.993 [2024-11-17 08:20:42.999124] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.993 [2024-11-17 08:20:42.999156] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.993 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:29.993 #164 NEW cov: 11115 ft: 16313 corp: 4/40b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 CopyPart- 00:08:30.253 [2024-11-17 08:20:43.202661] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.253 [2024-11-17 08:20:43.202691] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.253 #165 NEW cov: 11115 ft: 17204 corp: 5/53b lim: 13 exec/s: 165 rss: 74Mb L: 13/13 MS: 1 PersAutoDict- DE: "\001\212\226\244\310I\216\002"- 00:08:30.512 [2024-11-17 08:20:43.406672] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.512 [2024-11-17 
08:20:43.406709] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.512 #171 NEW cov: 11115 ft: 17514 corp: 6/66b lim: 13 exec/s: 171 rss: 74Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:30.512 [2024-11-17 08:20:43.598650] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.512 [2024-11-17 08:20:43.598680] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.771 #172 NEW cov: 11115 ft: 17678 corp: 7/79b lim: 13 exec/s: 172 rss: 74Mb L: 13/13 MS: 1 ChangeByte- 00:08:30.771 [2024-11-17 08:20:43.792009] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:30.771 [2024-11-17 08:20:43.792038] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:30.771 #178 NEW cov: 11115 ft: 17854 corp: 8/92b lim: 13 exec/s: 178 rss: 74Mb L: 13/13 MS: 1 PersAutoDict- DE: "\001\212\226\244\310I\216\002"- 00:08:31.031 [2024-11-17 08:20:43.985946] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.031 [2024-11-17 08:20:43.985976] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.031 #179 NEW cov: 11122 ft: 18083 corp: 9/105b lim: 13 exec/s: 179 rss: 74Mb L: 13/13 MS: 1 CopyPart- 00:08:31.290 [2024-11-17 08:20:44.178558] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.290 [2024-11-17 08:20:44.178588] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:31.290 #180 NEW cov: 11122 ft: 18213 corp: 10/118b lim: 13 exec/s: 90 rss: 74Mb L: 13/13 MS: 1 ChangeBit- 00:08:31.290 #180 DONE cov: 11122 ft: 18213 corp: 10/118b lim: 13 exec/s: 90 rss: 74Mb 00:08:31.290 ###### Recommended dictionary. ###### 00:08:31.290 "\001\212\226\244\310I\216\002" # Uses: 2 00:08:31.290 ###### End of recommended dictionary. 
###### 00:08:31.290 Done 180 runs in 2 second(s) 00:08:31.290 [2024-11-17 08:20:44.314878] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:31.550 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:31.550 08:20:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:31.550 [2024-11-17 08:20:44.588752] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 23.11.0 initialization... 
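Several mutations above are tagged CMP or PersAutoDict: when a comparison operand leads to new coverage, the fuzzer stores it in a persistent auto dictionary and replays it later, and the '###### Recommended dictionary ######' block at the end of a run lists those entries with their use counts (run 5 recommends the eight-byte value it reused twice, run 1 recommends "\377\000"). Entries like these can seed later runs through a dictionary file; libFuzzer-style targets normally take one via -dict=, but this harness is not shown forwarding that flag, so treat it as an assumption:

  # Hypothetical dictionary built from the recommended entries above
  # (hex escapes are the octal values converted; the filename is illustrative).
  printf '%s\n' \
      'kw_id="\x01\x8a\x96\xa4\xc8I\x8e\x02"' \
      'kw_ff00="\xff\x00"' > vfio.dict
  # Typical use with a libFuzzer-style target (flag forwarding is an assumption):
  #   ./llvm_vfio_fuzz ... -dict=vfio.dict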
00:08:31.550 [2024-11-17 08:20:44.588823] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1003511 ] 00:08:31.550 [2024-11-17 08:20:44.652799] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.809 [2024-11-17 08:20:44.691688] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.809 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.809 INFO: Seed: 1658095092 00:08:31.809 INFO: Loaded 1 modules (381510 inline 8-bit counters): 381510 [0x2a38ccc, 0x2a95f12), 00:08:31.809 INFO: Loaded 1 PC tables (381510 PCs): 381510 [0x2a95f18,0x3068378), 00:08:31.809 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:31.809 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.809 #2 INITED exec/s: 0 rss: 67Mb 00:08:31.809 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.809 This may also happen if the target rejected all inputs we tried so far 00:08:31.809 [2024-11-17 08:20:44.925281] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:08:32.068 [2024-11-17 08:20:44.971780] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.068 [2024-11-17 08:20:44.971812] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.328 NEW_FUNC[1/665]: 0x45c658 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:32.328 NEW_FUNC[2/665]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:32.328 #32 NEW cov: 11037 ft: 11041 corp: 2/10b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 5 InsertRepeatedBytes-ShuffleBytes-ShuffleBytes-ChangeBit-InsertByte- 00:08:32.328 [2024-11-17 08:20:45.462064] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.328 [2024-11-17 08:20:45.462109] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.587 NEW_FUNC[1/5]: 0x1596dc8 in map_one /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:731 00:08:32.587 NEW_FUNC[2/5]: 0x15bbd58 in nvmf_vfio_user_sq_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5660 00:08:32.587 #33 NEW cov: 11090 ft: 14144 corp: 3/19b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CrossOver- 00:08:32.587 [2024-11-17 08:20:45.672460] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.587 [2024-11-17 08:20:45.672493] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.846 NEW_FUNC[1/1]: 0x1be7258 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:32.846 #34 NEW cov: 11107 ft: 15488 corp: 4/28b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:32.846 [2024-11-17 08:20:45.879079] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.846 [2024-11-17 08:20:45.879109] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.106 #40 NEW cov: 11107 ft: 16807 corp: 5/37b lim: 
9 exec/s: 40 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:33.106 [2024-11-17 08:20:46.088774] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.106 [2024-11-17 08:20:46.088805] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.106 #41 NEW cov: 11107 ft: 17448 corp: 6/46b lim: 9 exec/s: 41 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:08:33.365 [2024-11-17 08:20:46.288180] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.366 [2024-11-17 08:20:46.288210] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.366 #47 NEW cov: 11107 ft: 17637 corp: 7/55b lim: 9 exec/s: 47 rss: 74Mb L: 9/9 MS: 1 ChangeByte- 00:08:33.366 [2024-11-17 08:20:46.488286] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.366 [2024-11-17 08:20:46.488314] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.625 #58 NEW cov: 11107 ft: 17860 corp: 8/64b lim: 9 exec/s: 58 rss: 74Mb L: 9/9 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:33.625 [2024-11-17 08:20:46.696917] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.625 [2024-11-17 08:20:46.696948] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.884 #59 NEW cov: 11114 ft: 18183 corp: 9/73b lim: 9 exec/s: 59 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:08:33.885 [2024-11-17 08:20:46.896459] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.885 [2024-11-17 08:20:46.896488] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.885 #60 NEW cov: 11114 ft: 18526 corp: 10/82b lim: 9 exec/s: 30 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:33.885 #60 DONE cov: 11114 ft: 18526 corp: 10/82b lim: 9 exec/s: 30 rss: 75Mb 00:08:33.885 ###### Recommended dictionary. ###### 00:08:33.885 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:33.885 ###### End of recommended dictionary. 
###### 00:08:33.885 Done 60 runs in 2 second(s) 00:08:34.144 [2024-11-17 08:20:47.031877] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:08:34.144 08:20:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:08:34.144 08:20:47 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:34.144 08:20:47 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.144 08:20:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:08:34.144 00:08:34.144 real 0m19.230s 00:08:34.144 user 0m26.913s 00:08:34.144 sys 0m1.880s 00:08:34.144 08:20:47 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:34.144 08:20:47 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:34.144 ************************************ 00:08:34.144 END TEST vfio_llvm_fuzz 00:08:34.144 ************************************ 00:08:34.403 00:08:34.403 real 1m22.057s 00:08:34.403 user 2m6.260s 00:08:34.403 sys 0m9.087s 00:08:34.403 08:20:47 llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:34.403 08:20:47 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:34.403 ************************************ 00:08:34.403 END TEST llvm_fuzz 00:08:34.403 ************************************ 00:08:34.403 08:20:47 -- spdk/autotest.sh@374 -- # [[ '' -eq 1 ]] 00:08:34.403 08:20:47 -- spdk/autotest.sh@381 -- # trap - SIGINT SIGTERM EXIT 00:08:34.403 08:20:47 -- spdk/autotest.sh@383 -- # timing_enter post_cleanup 00:08:34.403 08:20:47 -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:34.403 08:20:47 -- common/autotest_common.sh@10 -- # set +x 00:08:34.403 08:20:47 -- spdk/autotest.sh@384 -- # autotest_cleanup 00:08:34.403 08:20:47 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:08:34.403 08:20:47 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:08:34.403 08:20:47 -- common/autotest_common.sh@10 -- # set +x 00:08:40.989 INFO: APP EXITING 00:08:40.989 INFO: killing all VMs 00:08:40.989 INFO: killing vhost app 00:08:40.989 INFO: EXIT DONE 00:08:43.526 Waiting for block devices as requested 00:08:43.526 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:43.526 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:43.526 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:43.526 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:43.526 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:43.526 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:43.785 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:43.785 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:43.785 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:44.045 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:44.045 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:44.045 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:44.305 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:44.305 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:44.305 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:44.564 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:44.564 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:48.759 Cleaning 00:08:48.759 Removing: /dev/shm/spdk_tgt_trace.pid976187 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1000004 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1000632 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1001169 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1001625 00:08:48.759 Removing: 
/var/run/dpdk/spdk_pid1002002 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1002532 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1003066 00:08:48.759 Removing: /var/run/dpdk/spdk_pid1003511 00:08:48.759 Removing: /var/run/dpdk/spdk_pid973724 00:08:48.759 Removing: /var/run/dpdk/spdk_pid974910 00:08:48.759 Removing: /var/run/dpdk/spdk_pid976187 00:08:48.759 Removing: /var/run/dpdk/spdk_pid976644 00:08:48.759 Removing: /var/run/dpdk/spdk_pid977729 00:08:48.759 Removing: /var/run/dpdk/spdk_pid977759 00:08:48.759 Removing: /var/run/dpdk/spdk_pid978858 00:08:48.759 Removing: /var/run/dpdk/spdk_pid978871 00:08:48.759 Removing: /var/run/dpdk/spdk_pid979307 00:08:48.759 Removing: /var/run/dpdk/spdk_pid979626 00:08:48.759 Removing: /var/run/dpdk/spdk_pid979947 00:08:48.759 Removing: /var/run/dpdk/spdk_pid980289 00:08:48.759 Removing: /var/run/dpdk/spdk_pid980516 00:08:48.759 Removing: /var/run/dpdk/spdk_pid980672 00:08:48.759 Removing: /var/run/dpdk/spdk_pid980941 00:08:48.759 Removing: /var/run/dpdk/spdk_pid981262 00:08:48.759 Removing: /var/run/dpdk/spdk_pid982118 00:08:48.759 Removing: /var/run/dpdk/spdk_pid985237 00:08:48.759 Removing: /var/run/dpdk/spdk_pid985543 00:08:48.759 Removing: /var/run/dpdk/spdk_pid985622 00:08:48.759 Removing: /var/run/dpdk/spdk_pid985792 00:08:48.759 Removing: /var/run/dpdk/spdk_pid986244 00:08:48.759 Removing: /var/run/dpdk/spdk_pid986412 00:08:48.759 Removing: /var/run/dpdk/spdk_pid986905 00:08:48.759 Removing: /var/run/dpdk/spdk_pid987028 00:08:48.759 Removing: /var/run/dpdk/spdk_pid987331 00:08:48.759 Removing: /var/run/dpdk/spdk_pid987340 00:08:48.759 Removing: /var/run/dpdk/spdk_pid987626 00:08:48.759 Removing: /var/run/dpdk/spdk_pid987639 00:08:48.759 Removing: /var/run/dpdk/spdk_pid988274 00:08:48.759 Removing: /var/run/dpdk/spdk_pid988428 00:08:48.759 Removing: /var/run/dpdk/spdk_pid988593 00:08:48.759 Removing: /var/run/dpdk/spdk_pid988917 00:08:48.759 Removing: /var/run/dpdk/spdk_pid989438 00:08:48.759 Removing: /var/run/dpdk/spdk_pid989960 00:08:48.759 Removing: /var/run/dpdk/spdk_pid990433 00:08:48.759 Removing: /var/run/dpdk/spdk_pid990776 00:08:48.759 Removing: /var/run/dpdk/spdk_pid991308 00:08:48.759 Removing: /var/run/dpdk/spdk_pid991630 00:08:48.759 Removing: /var/run/dpdk/spdk_pid992134 00:08:48.759 Removing: /var/run/dpdk/spdk_pid992665 00:08:48.759 Removing: /var/run/dpdk/spdk_pid992959 00:08:48.759 Removing: /var/run/dpdk/spdk_pid993487 00:08:48.759 Removing: /var/run/dpdk/spdk_pid993886 00:08:48.759 Removing: /var/run/dpdk/spdk_pid994310 00:08:48.759 Removing: /var/run/dpdk/spdk_pid994836 00:08:48.759 Removing: /var/run/dpdk/spdk_pid995132 00:08:48.759 Removing: /var/run/dpdk/spdk_pid995657 00:08:48.759 Removing: /var/run/dpdk/spdk_pid996115 00:08:48.759 Removing: /var/run/dpdk/spdk_pid996478 00:08:48.759 Removing: /var/run/dpdk/spdk_pid997014 00:08:48.759 Removing: /var/run/dpdk/spdk_pid997374 00:08:48.759 Removing: /var/run/dpdk/spdk_pid997835 00:08:48.759 Removing: /var/run/dpdk/spdk_pid998364 00:08:48.759 Removing: /var/run/dpdk/spdk_pid998660 00:08:48.759 Removing: /var/run/dpdk/spdk_pid999186 00:08:48.759 Removing: /var/run/dpdk/spdk_pid999624 00:08:48.759 Clean 00:08:48.759 08:21:01 -- common/autotest_common.sh@1451 -- # return 0 00:08:48.759 08:21:01 -- spdk/autotest.sh@385 -- # timing_exit post_cleanup 00:08:48.759 08:21:01 -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:48.759 08:21:01 -- common/autotest_common.sh@10 -- # set +x 00:08:48.759 08:21:01 -- spdk/autotest.sh@387 -- # timing_exit autotest 00:08:48.759 08:21:01 
-- common/autotest_common.sh@730 -- # xtrace_disable 00:08:48.759 08:21:01 -- common/autotest_common.sh@10 -- # set +x 00:08:48.759 08:21:01 -- spdk/autotest.sh@388 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:48.759 08:21:01 -- spdk/autotest.sh@390 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:48.759 08:21:01 -- spdk/autotest.sh@390 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:48.759 08:21:01 -- spdk/autotest.sh@392 -- # [[ y == y ]] 00:08:48.759 08:21:01 -- spdk/autotest.sh@394 -- # hostname 00:08:48.759 08:21:01 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:08:48.759 geninfo: WARNING: invalid characters removed from testname! 00:08:52.052 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:08:58.639 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:09:01.177 08:21:14 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:09.380 08:21:21 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:13.573 08:21:26 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:18.849 08:21:31 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:24.125 08:21:36 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:29.400 08:21:42 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:34.672 08:21:47 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:34.672 08:21:47 -- common/autotest_common.sh@1680 -- $ [[ y == y ]] 00:09:34.672 08:21:47 -- common/autotest_common.sh@1681 -- $ lcov --version 00:09:34.672 08:21:47 -- common/autotest_common.sh@1681 -- $ awk '{print $NF}' 00:09:34.672 08:21:47 -- common/autotest_common.sh@1681 -- $ lt 1.15 2 00:09:34.672 08:21:47 -- scripts/common.sh@373 -- $ cmp_versions 1.15 '<' 2 00:09:34.672 08:21:47 -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:09:34.672 08:21:47 -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:09:34.672 08:21:47 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:34.672 08:21:47 -- scripts/common.sh@336 -- $ read -ra ver1 00:09:34.672 08:21:47 -- scripts/common.sh@337 -- $ IFS=.-: 00:09:34.672 08:21:47 -- scripts/common.sh@337 -- $ read -ra ver2 00:09:34.672 08:21:47 -- scripts/common.sh@338 -- $ local 'op=<' 00:09:34.672 08:21:47 -- scripts/common.sh@340 -- $ ver1_l=2 00:09:34.672 08:21:47 -- scripts/common.sh@341 -- $ ver2_l=1 00:09:34.672 08:21:47 -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:09:34.672 08:21:47 -- scripts/common.sh@344 -- $ case "$op" in 00:09:34.672 08:21:47 -- scripts/common.sh@345 -- $ : 1 00:09:34.672 08:21:47 -- scripts/common.sh@364 -- $ (( v = 0 )) 00:09:34.672 08:21:47 -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:34.672 08:21:47 -- scripts/common.sh@365 -- $ decimal 1 00:09:34.672 08:21:47 -- scripts/common.sh@353 -- $ local d=1 00:09:34.672 08:21:47 -- scripts/common.sh@354 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:34.672 08:21:47 -- scripts/common.sh@355 -- $ echo 1 00:09:34.672 08:21:47 -- scripts/common.sh@365 -- $ ver1[v]=1 00:09:34.672 08:21:47 -- scripts/common.sh@366 -- $ decimal 2 00:09:34.672 08:21:47 -- scripts/common.sh@353 -- $ local d=2 00:09:34.672 08:21:47 -- scripts/common.sh@354 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:34.672 08:21:47 -- scripts/common.sh@355 -- $ echo 2 00:09:34.672 08:21:47 -- scripts/common.sh@366 -- $ ver2[v]=2 00:09:34.672 08:21:47 -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:09:34.672 08:21:47 -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:09:34.672 08:21:47 -- scripts/common.sh@368 -- $ return 0 00:09:34.672 08:21:47 -- common/autotest_common.sh@1682 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:34.672 08:21:47 -- common/autotest_common.sh@1694 -- $ export 'LCOV_OPTS= 00:09:34.672 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:34.672 --rc genhtml_branch_coverage=1 00:09:34.672 --rc genhtml_function_coverage=1 00:09:34.672 --rc genhtml_legend=1 00:09:34.672 --rc geninfo_all_blocks=1 00:09:34.672 --rc geninfo_unexecuted_blocks=1 00:09:34.672 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:34.672 ' 00:09:34.672 08:21:47 -- common/autotest_common.sh@1694 -- $ LCOV_OPTS=' 00:09:34.672 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:34.672 --rc genhtml_branch_coverage=1 00:09:34.672 --rc genhtml_function_coverage=1 00:09:34.672 --rc genhtml_legend=1 00:09:34.672 --rc geninfo_all_blocks=1 00:09:34.672 --rc geninfo_unexecuted_blocks=1 00:09:34.672 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:34.672 ' 00:09:34.672 08:21:47 -- common/autotest_common.sh@1695 -- $ export 'LCOV=lcov 00:09:34.672 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:34.672 --rc genhtml_branch_coverage=1 00:09:34.672 --rc genhtml_function_coverage=1 00:09:34.672 --rc genhtml_legend=1 00:09:34.672 --rc geninfo_all_blocks=1 00:09:34.672 --rc geninfo_unexecuted_blocks=1 00:09:34.672 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:34.672 ' 00:09:34.672 08:21:47 -- common/autotest_common.sh@1695 -- $ LCOV='lcov 00:09:34.672 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:34.672 --rc genhtml_branch_coverage=1 00:09:34.672 --rc genhtml_function_coverage=1 00:09:34.672 --rc genhtml_legend=1 00:09:34.672 --rc geninfo_all_blocks=1 00:09:34.672 --rc geninfo_unexecuted_blocks=1 00:09:34.672 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:34.672 ' 00:09:34.672 08:21:47 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:34.672 08:21:47 -- scripts/common.sh@15 -- $ shopt -s extglob 00:09:34.672 08:21:47 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:34.672 08:21:47 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:34.672 08:21:47 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:34.672 08:21:47 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:34.672 08:21:47 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:34.672 08:21:47 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:34.672 08:21:47 -- paths/export.sh@5 -- $ export PATH 00:09:34.672 08:21:47 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:34.672 08:21:47 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:34.672 08:21:47 -- common/autobuild_common.sh@479 -- $ date +%s 00:09:34.672 08:21:47 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1731828107.XXXXXX 00:09:34.672 08:21:47 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1731828107.6A6SsK 00:09:34.672 08:21:47 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:09:34.672 08:21:47 -- common/autobuild_common.sh@485 -- $ '[' -n v23.11 ']' 00:09:34.672 08:21:47 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:34.672 08:21:47 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:09:34.672 08:21:47 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:34.672 08:21:47 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:34.672 08:21:47 -- common/autobuild_common.sh@495 -- $ get_config_params 00:09:34.672 08:21:47 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:09:34.672 08:21:47 -- common/autotest_common.sh@10 -- $ set +x 00:09:34.672 08:21:47 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:09:34.673 08:21:47 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:09:34.673 08:21:47 -- pm/common@17 -- $ local monitor 00:09:34.673 08:21:47 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:34.673 08:21:47 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:34.673 08:21:47 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:34.673 08:21:47 -- pm/common@21 -- $ date +%s 00:09:34.673 08:21:47 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:34.673 08:21:47 -- pm/common@21 -- $ date +%s 00:09:34.673 08:21:47 -- pm/common@25 -- $ sleep 1 00:09:34.673 08:21:47 -- pm/common@21 -- $ date +%s 00:09:34.673 08:21:47 -- pm/common@21 -- $ date +%s 00:09:34.673 08:21:47 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1731828107 00:09:34.673 08:21:47 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1731828107 00:09:34.673 08:21:47 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1731828107 00:09:34.673 08:21:47 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1731828107 00:09:34.673 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1731828107_collect-cpu-load.pm.log 00:09:34.673 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1731828107_collect-cpu-temp.pm.log 00:09:34.673 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1731828107_collect-vmstat.pm.log 00:09:34.673 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1731828107_collect-bmc-pm.bmc.pm.log 00:09:35.612 08:21:48 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:09:35.612 08:21:48 -- spdk/autopackage.sh@10 -- $ [[ 0 -eq 1 ]] 00:09:35.612 08:21:48 -- spdk/autopackage.sh@14 -- $ timing_finish 00:09:35.612 08:21:48 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:35.612 08:21:48 -- common/autotest_common.sh@737 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]] 00:09:35.612 08:21:48 -- common/autotest_common.sh@740 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:35.612 08:21:48 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:09:35.612 08:21:48 -- pm/common@29 -- $ signal_monitor_resources TERM 00:09:35.612 08:21:48 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:09:35.612 08:21:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:35.612 08:21:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:09:35.612 08:21:48 -- pm/common@44 -- $ pid=1012553 00:09:35.612 08:21:48 -- pm/common@50 -- $ kill 
-TERM 1012553 00:09:35.612 08:21:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:35.612 08:21:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:09:35.612 08:21:48 -- pm/common@44 -- $ pid=1012555 00:09:35.612 08:21:48 -- pm/common@50 -- $ kill -TERM 1012555 00:09:35.612 08:21:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:35.612 08:21:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:09:35.612 08:21:48 -- pm/common@44 -- $ pid=1012557 00:09:35.612 08:21:48 -- pm/common@50 -- $ kill -TERM 1012557 00:09:35.612 08:21:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:35.612 08:21:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:09:35.612 08:21:48 -- pm/common@44 -- $ pid=1012582 00:09:35.612 08:21:48 -- pm/common@50 -- $ sudo -E kill -TERM 1012582 00:09:35.612 + [[ -n 848671 ]] 00:09:35.612 + sudo kill 848671 00:09:35.621 [Pipeline] } 00:09:35.637 [Pipeline] // stage 00:09:35.641 [Pipeline] } 00:09:35.654 [Pipeline] // timeout 00:09:35.660 [Pipeline] } 00:09:35.674 [Pipeline] // catchError 00:09:35.679 [Pipeline] } 00:09:35.697 [Pipeline] // wrap 00:09:35.703 [Pipeline] } 00:09:35.718 [Pipeline] // catchError 00:09:35.728 [Pipeline] stage 00:09:35.730 [Pipeline] { (Epilogue) 00:09:35.743 [Pipeline] catchError 00:09:35.744 [Pipeline] { 00:09:35.759 [Pipeline] echo 00:09:35.761 Cleanup processes 00:09:35.768 [Pipeline] sh 00:09:36.055 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:36.055 1012706 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache 00:09:36.055 1013119 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:36.070 [Pipeline] sh 00:09:36.356 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:36.356 ++ grep -v 'sudo pgrep' 00:09:36.356 ++ awk '{print $1}' 00:09:36.356 + sudo kill -9 1012706 00:09:36.368 [Pipeline] sh 00:09:36.651 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:36.651 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:36.651 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:38.029 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:48.041 [Pipeline] sh 00:09:48.328 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:48.328 Artifacts sizes are good 00:09:48.345 [Pipeline] archiveArtifacts 00:09:48.353 Archiving artifacts 00:09:48.529 [Pipeline] sh 00:09:48.865 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:09:48.879 [Pipeline] cleanWs 00:09:48.889 [WS-CLEANUP] Deleting project workspace... 00:09:48.889 [WS-CLEANUP] Deferred wipeout is used... 00:09:48.896 [WS-CLEANUP] done 00:09:48.897 [Pipeline] } 00:09:48.914 [Pipeline] // catchError 00:09:48.926 [Pipeline] sh 00:09:49.210 + logger -p user.info -t JENKINS-CI 00:09:49.219 [Pipeline] } 00:09:49.233 [Pipeline] // stage 00:09:49.238 [Pipeline] } 00:09:49.253 [Pipeline] // node 00:09:49.260 [Pipeline] End of Pipeline 00:09:49.305 Finished: SUCCESS