00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v22.11" build number 2372 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3637 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.001 Started by timer 00:00:00.093 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.094 The recommended git tool is: git 00:00:00.094 using credential 00000000-0000-0000-0000-000000000002 00:00:00.096 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.127 Fetching changes from the remote Git repository 00:00:00.130 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.179 Using shallow fetch with depth 1 00:00:00.179 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.179 > git --version # timeout=10 00:00:00.223 > git --version # 'git version 2.39.2' 00:00:00.223 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.262 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.262 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:05.879 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:05.890 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:05.904 Checking out Revision b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf (FETCH_HEAD) 00:00:05.904 > git config core.sparsecheckout # timeout=10 00:00:05.914 > git read-tree -mu HEAD # timeout=10 00:00:05.929 > git checkout -f b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=5 00:00:05.947 Commit message: "jenkins/jjb-config: Ignore OS version mismatch under freebsd" 00:00:05.947 > git rev-list --no-walk b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=10 00:00:06.027 [Pipeline] Start of Pipeline 00:00:06.040 [Pipeline] library 00:00:06.041 Loading library shm_lib@master 00:00:06.042 Library shm_lib@master is cached. Copying from home. 00:00:06.056 [Pipeline] node 00:00:06.114 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:06.115 [Pipeline] { 00:00:06.123 [Pipeline] catchError 00:00:06.125 [Pipeline] { 00:00:06.136 [Pipeline] wrap 00:00:06.143 [Pipeline] { 00:00:06.151 [Pipeline] stage 00:00:06.153 [Pipeline] { (Prologue) 00:00:06.357 [Pipeline] sh 00:00:07.193 + logger -p user.info -t JENKINS-CI 00:00:07.225 [Pipeline] echo 00:00:07.226 Node: WFP20 00:00:07.233 [Pipeline] sh 00:00:07.574 [Pipeline] setCustomBuildProperty 00:00:07.586 [Pipeline] echo 00:00:07.588 Cleanup processes 00:00:07.595 [Pipeline] sh 00:00:07.890 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.890 4706 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.907 [Pipeline] sh 00:00:08.202 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:08.202 ++ grep -v 'sudo pgrep' 00:00:08.202 ++ awk '{print $1}' 00:00:08.202 + sudo kill -9 00:00:08.202 + true 00:00:08.217 [Pipeline] cleanWs 00:00:08.226 [WS-CLEANUP] Deleting project workspace... 00:00:08.226 [WS-CLEANUP] Deferred wipeout is used... 
00:00:08.239 [WS-CLEANUP] done 00:00:08.242 [Pipeline] setCustomBuildProperty 00:00:08.256 [Pipeline] sh 00:00:08.545 + sudo git config --global --replace-all safe.directory '*' 00:00:08.658 [Pipeline] httpRequest 00:00:10.708 [Pipeline] echo 00:00:10.710 Sorcerer 10.211.164.20 is alive 00:00:10.721 [Pipeline] retry 00:00:10.723 [Pipeline] { 00:00:10.738 [Pipeline] httpRequest 00:00:10.742 HttpMethod: GET 00:00:10.743 URL: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:10.744 Sending request to url: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:10.763 Response Code: HTTP/1.1 200 OK 00:00:10.764 Success: Status code 200 is in the accepted range: 200,404 00:00:10.764 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:12.338 [Pipeline] } 00:00:12.355 [Pipeline] // retry 00:00:12.362 [Pipeline] sh 00:00:12.660 + tar --no-same-owner -xf jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:12.679 [Pipeline] httpRequest 00:00:13.456 [Pipeline] echo 00:00:13.457 Sorcerer 10.211.164.20 is alive 00:00:13.464 [Pipeline] retry 00:00:13.466 [Pipeline] { 00:00:13.477 [Pipeline] httpRequest 00:00:13.482 HttpMethod: GET 00:00:13.483 URL: http://10.211.164.20/packages/spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:00:13.484 Sending request to url: http://10.211.164.20/packages/spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:00:13.508 Response Code: HTTP/1.1 200 OK 00:00:13.508 Success: Status code 200 is in the accepted range: 200,404 00:00:13.508 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:01:59.682 [Pipeline] } 00:01:59.698 [Pipeline] // retry 00:01:59.704 [Pipeline] sh 00:02:00.002 + tar --no-same-owner -xf spdk_83e8405e4c25408c010ba2b9e02ce45e2347370c.tar.gz 00:02:02.578 [Pipeline] sh 00:02:02.868 + git -C spdk log --oneline -n5 00:02:02.868 83e8405e4 nvmf/fc: Qpair disconnect callback: Serialize FC delete connection & close qpair process 00:02:02.868 0eab4c6fb nvmf/fc: Validate the ctrlr pointer inside nvmf_fc_req_bdev_abort() 00:02:02.868 4bcab9fb9 correct kick for CQ full case 00:02:02.868 8531656d3 test/nvmf: Interrupt test for local pcie nvme device 00:02:02.868 318515b44 nvme/perf: interrupt mode support for pcie controller 00:02:02.889 [Pipeline] withCredentials 00:02:02.901 > git --version # timeout=10 00:02:02.915 > git --version # 'git version 2.39.2' 00:02:02.943 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:02:02.945 [Pipeline] { 00:02:02.954 [Pipeline] retry 00:02:02.956 [Pipeline] { 00:02:02.972 [Pipeline] sh 00:02:03.497 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:02:03.770 [Pipeline] } 00:02:03.787 [Pipeline] // retry 00:02:03.791 [Pipeline] } 00:02:03.806 [Pipeline] // withCredentials 00:02:03.814 [Pipeline] httpRequest 00:02:04.131 [Pipeline] echo 00:02:04.132 Sorcerer 10.211.164.20 is alive 00:02:04.141 [Pipeline] retry 00:02:04.143 [Pipeline] { 00:02:04.156 [Pipeline] httpRequest 00:02:04.160 HttpMethod: GET 00:02:04.161 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:02:04.162 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:02:04.165 Response Code: HTTP/1.1 200 OK 00:02:04.165 Success: Status code 200 is in the accepted range: 200,404 00:02:04.166 Saving response body to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:02:06.422 [Pipeline] } 00:02:06.440 [Pipeline] // retry 00:02:06.447 [Pipeline] sh 00:02:06.736 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:02:08.132 [Pipeline] sh 00:02:08.420 + git -C dpdk log --oneline -n5 00:02:08.420 caf0f5d395 version: 22.11.4 00:02:08.420 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:02:08.420 dc9c799c7d vhost: fix missing spinlock unlock 00:02:08.420 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:02:08.420 6ef77f2a5e net/gve: fix RX buffer size alignment 00:02:08.431 [Pipeline] } 00:02:08.444 [Pipeline] // stage 00:02:08.453 [Pipeline] stage 00:02:08.455 [Pipeline] { (Prepare) 00:02:08.473 [Pipeline] writeFile 00:02:08.487 [Pipeline] sh 00:02:08.775 + logger -p user.info -t JENKINS-CI 00:02:08.786 [Pipeline] sh 00:02:09.074 + logger -p user.info -t JENKINS-CI 00:02:09.086 [Pipeline] sh 00:02:09.377 + cat autorun-spdk.conf 00:02:09.377 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:09.377 SPDK_TEST_FUZZER_SHORT=1 00:02:09.377 SPDK_TEST_FUZZER=1 00:02:09.377 SPDK_TEST_SETUP=1 00:02:09.377 SPDK_RUN_UBSAN=1 00:02:09.377 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:09.377 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:09.386 RUN_NIGHTLY=1 00:02:09.391 [Pipeline] readFile 00:02:09.424 [Pipeline] withEnv 00:02:09.426 [Pipeline] { 00:02:09.438 [Pipeline] sh 00:02:09.722 + set -ex 00:02:09.722 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:02:09.722 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:09.722 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:09.722 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:09.722 ++ SPDK_TEST_FUZZER=1 00:02:09.722 ++ SPDK_TEST_SETUP=1 00:02:09.722 ++ SPDK_RUN_UBSAN=1 00:02:09.722 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:09.722 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:09.722 ++ RUN_NIGHTLY=1 00:02:09.722 + case $SPDK_TEST_NVMF_NICS in 00:02:09.722 + DRIVERS= 00:02:09.722 + [[ -n '' ]] 00:02:09.722 + exit 0 00:02:09.732 [Pipeline] } 00:02:09.746 [Pipeline] // withEnv 00:02:09.751 [Pipeline] } 00:02:09.765 [Pipeline] // stage 00:02:09.772 [Pipeline] catchError 00:02:09.774 [Pipeline] { 00:02:09.785 [Pipeline] timeout 00:02:09.786 Timeout set to expire in 30 min 00:02:09.788 [Pipeline] { 00:02:09.801 [Pipeline] stage 00:02:09.803 [Pipeline] { (Tests) 00:02:09.817 [Pipeline] sh 00:02:10.108 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:10.108 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:10.108 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:02:10.108 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:02:10.108 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:10.108 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:10.108 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:02:10.108 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:10.108 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:10.108 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:10.108 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:02:10.108 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:10.108 + source /etc/os-release 00:02:10.108 ++ NAME='Fedora Linux' 00:02:10.108 ++ VERSION='39 (Cloud Edition)' 00:02:10.108 ++ ID=fedora 00:02:10.108 ++ VERSION_ID=39 00:02:10.108 ++ VERSION_CODENAME= 00:02:10.108 ++ PLATFORM_ID=platform:f39 00:02:10.108 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:02:10.108 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:10.108 ++ LOGO=fedora-logo-icon 00:02:10.108 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:02:10.108 ++ HOME_URL=https://fedoraproject.org/ 00:02:10.108 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:02:10.108 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:10.109 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:10.109 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:10.109 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:02:10.109 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:10.109 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:02:10.109 ++ SUPPORT_END=2024-11-12 00:02:10.109 ++ VARIANT='Cloud Edition' 00:02:10.109 ++ VARIANT_ID=cloud 00:02:10.109 + uname -a 00:02:10.109 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:02:10.109 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:13.404 Hugepages 00:02:13.404 node hugesize free / total 00:02:13.404 node0 1048576kB 0 / 0 00:02:13.404 node0 2048kB 0 / 0 00:02:13.404 node1 1048576kB 0 / 0 00:02:13.404 node1 2048kB 0 / 0 00:02:13.404 00:02:13.404 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:13.404 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:13.404 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:13.404 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:02:13.404 + rm -f /tmp/spdk-ld-path 00:02:13.404 + source autorun-spdk.conf 00:02:13.404 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:13.404 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:13.404 ++ SPDK_TEST_FUZZER=1 00:02:13.404 ++ SPDK_TEST_SETUP=1 00:02:13.404 ++ SPDK_RUN_UBSAN=1 00:02:13.404 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:13.404 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.404 ++ RUN_NIGHTLY=1 00:02:13.404 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:13.404 + [[ -n '' ]] 00:02:13.404 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:13.404 + for M in 
/var/spdk/build-*-manifest.txt 00:02:13.404 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:02:13.404 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:13.404 + for M in /var/spdk/build-*-manifest.txt 00:02:13.404 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:13.404 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:13.404 + for M in /var/spdk/build-*-manifest.txt 00:02:13.404 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:13.404 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:13.404 ++ uname 00:02:13.404 + [[ Linux == \L\i\n\u\x ]] 00:02:13.404 + sudo dmesg -T 00:02:13.404 + sudo dmesg --clear 00:02:13.404 + dmesg_pid=6180 00:02:13.404 + [[ Fedora Linux == FreeBSD ]] 00:02:13.404 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:13.404 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:13.405 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:13.405 + sudo dmesg -Tw 00:02:13.405 + [[ -x /usr/src/fio-static/fio ]] 00:02:13.405 + export FIO_BIN=/usr/src/fio-static/fio 00:02:13.405 + FIO_BIN=/usr/src/fio-static/fio 00:02:13.405 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:13.405 + [[ ! -v VFIO_QEMU_BIN ]] 00:02:13.405 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:13.405 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:13.405 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:13.405 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:13.405 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:13.405 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:13.405 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:13.405 04:18:52 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:02:13.405 04:18:52 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_TEST_FUZZER_SHORT=1 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_SETUP=1 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_RUN_UBSAN=1 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.405 04:18:52 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1 00:02:13.405 04:18:52 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT 00:02:13.405 04:18:52 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:13.405 04:18:52 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:02:13.405 04:18:52 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:13.405 04:18:52 -- scripts/common.sh@15 -- $ shopt -s extglob 00:02:13.405 04:18:52 -- scripts/common.sh@544 -- $ [[ -e 
/bin/wpdk_common.sh ]] 00:02:13.405 04:18:52 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:13.405 04:18:52 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:13.405 04:18:52 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.405 04:18:52 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.405 04:18:52 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.405 04:18:52 -- paths/export.sh@5 -- $ export PATH 00:02:13.405 04:18:52 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.405 04:18:52 -- common/autobuild_common.sh@485 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:13.405 04:18:52 -- common/autobuild_common.sh@486 -- $ date +%s 00:02:13.405 04:18:52 -- common/autobuild_common.sh@486 -- $ mktemp -dt spdk_1731813532.XXXXXX 00:02:13.405 04:18:52 -- common/autobuild_common.sh@486 -- $ SPDK_WORKSPACE=/tmp/spdk_1731813532.fFJVNK 00:02:13.405 04:18:52 -- common/autobuild_common.sh@488 -- $ [[ -n '' ]] 00:02:13.405 04:18:52 -- common/autobuild_common.sh@492 -- $ '[' -n v22.11.4 ']' 00:02:13.405 04:18:52 -- common/autobuild_common.sh@493 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.405 04:18:52 -- common/autobuild_common.sh@493 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:02:13.405 04:18:52 -- common/autobuild_common.sh@499 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:13.405 04:18:52 -- common/autobuild_common.sh@501 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:13.405 
04:18:52 -- common/autobuild_common.sh@502 -- $ get_config_params 00:02:13.405 04:18:52 -- common/autotest_common.sh@409 -- $ xtrace_disable 00:02:13.405 04:18:52 -- common/autotest_common.sh@10 -- $ set +x 00:02:13.405 04:18:52 -- common/autobuild_common.sh@502 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:02:13.405 04:18:52 -- common/autobuild_common.sh@504 -- $ start_monitor_resources 00:02:13.405 04:18:52 -- pm/common@17 -- $ local monitor 00:02:13.405 04:18:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.405 04:18:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.405 04:18:52 -- pm/common@21 -- $ date +%s 00:02:13.405 04:18:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.405 04:18:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.405 04:18:52 -- pm/common@21 -- $ date +%s 00:02:13.405 04:18:52 -- pm/common@25 -- $ sleep 1 00:02:13.405 04:18:52 -- pm/common@21 -- $ date +%s 00:02:13.405 04:18:52 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731813532 00:02:13.405 04:18:52 -- pm/common@21 -- $ date +%s 00:02:13.405 04:18:52 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731813532 00:02:13.405 04:18:52 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731813532 00:02:13.405 04:18:52 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731813532 00:02:13.665 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731813532_collect-cpu-load.pm.log 00:02:13.665 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731813532_collect-vmstat.pm.log 00:02:13.665 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731813532_collect-cpu-temp.pm.log 00:02:13.665 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731813532_collect-bmc-pm.bmc.pm.log 00:02:14.605 04:18:53 -- common/autobuild_common.sh@505 -- $ trap stop_monitor_resources EXIT 00:02:14.605 04:18:53 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:14.605 04:18:53 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:14.605 04:18:53 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:14.605 04:18:53 -- spdk/autobuild.sh@16 -- $ date -u 00:02:14.605 Sun Nov 17 03:18:53 AM UTC 2024 00:02:14.605 04:18:53 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:14.605 v25.01-pre-189-g83e8405e4 00:02:14.605 04:18:53 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:14.605 04:18:53 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:14.605 04:18:53 -- 
spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:14.605 04:18:53 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:02:14.605 04:18:53 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:14.605 04:18:53 -- common/autotest_common.sh@10 -- $ set +x 00:02:14.605 ************************************ 00:02:14.605 START TEST ubsan 00:02:14.605 ************************************ 00:02:14.605 04:18:53 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan' 00:02:14.605 using ubsan 00:02:14.605 00:02:14.605 real 0m0.001s 00:02:14.605 user 0m0.001s 00:02:14.605 sys 0m0.000s 00:02:14.605 04:18:53 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:14.605 04:18:53 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:14.605 ************************************ 00:02:14.605 END TEST ubsan 00:02:14.605 ************************************ 00:02:14.605 04:18:53 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:02:14.605 04:18:53 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:02:14.605 04:18:53 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:02:14.605 04:18:53 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:02:14.605 04:18:53 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:14.605 04:18:53 -- common/autotest_common.sh@10 -- $ set +x 00:02:14.605 ************************************ 00:02:14.605 START TEST build_native_dpdk 00:02:14.605 ************************************ 00:02:14.605 04:18:53 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:02:14.605 04:18:53 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:02:14.606 caf0f5d395 version: 22.11.4 00:02:14.606 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:02:14.606 dc9c799c7d vhost: fix missing spinlock unlock 00:02:14.606 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:02:14.606 6ef77f2a5e net/gve: fix RX buffer size alignment 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:02:14.606 04:18:53 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:14.606 04:18:53 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:14.866 04:18:53 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:14.866 04:18:53 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:14.866 
04:18:53 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:14.866 04:18:53 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:14.866 04:18:53 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:14.866 04:18:53 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:02:14.867 patching file config/rte_config.h 00:02:14.867 Hunk #1 succeeded at 60 (offset 1 line). 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:02:14.867 patching file lib/pcapng/rte_pcapng.c 00:02:14.867 Hunk #1 succeeded at 110 (offset -18 lines). 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 22.11.4 24.07.0 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 22.11.4 '>=' 24.07.0 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:14.867 04:18:53 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:02:14.867 04:18:53 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:21.448 The Meson build system 00:02:21.448 Version: 1.5.0 00:02:21.448 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:21.448 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:21.448 Build type: native build 00:02:21.448 Program cat found: YES (/usr/bin/cat) 00:02:21.448 Project name: DPDK 00:02:21.448 Project version: 22.11.4 00:02:21.448 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:21.448 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:21.448 Host machine cpu family: x86_64 00:02:21.448 Host machine cpu: x86_64 00:02:21.448 Message: ## Building in Developer Mode ## 00:02:21.448 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:21.448 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:21.448 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:21.449 Program objdump found: YES (/usr/bin/objdump) 00:02:21.449 Program python3 found: YES (/usr/bin/python3) 00:02:21.449 Program cat found: YES (/usr/bin/cat) 00:02:21.449 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:02:21.449 Checking for size of "void *" : 8 00:02:21.449 Checking for size of "void *" : 8 (cached) 00:02:21.449 Library m found: YES 00:02:21.449 Library numa found: YES 00:02:21.449 Has header "numaif.h" : YES 00:02:21.449 Library fdt found: NO 00:02:21.449 Library execinfo found: NO 00:02:21.449 Has header "execinfo.h" : YES 00:02:21.449 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:21.449 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:21.449 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:21.449 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:21.449 Run-time dependency openssl found: YES 3.1.1 00:02:21.449 Run-time dependency libpcap found: YES 1.10.4 00:02:21.449 Has header "pcap.h" with dependency libpcap: YES 00:02:21.449 Compiler for C supports arguments -Wcast-qual: YES 00:02:21.449 Compiler for C supports arguments -Wdeprecated: YES 00:02:21.449 Compiler for C supports arguments -Wformat: YES 00:02:21.449 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:21.449 Compiler for C supports arguments -Wformat-security: NO 00:02:21.449 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:21.449 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:21.449 Compiler for C supports arguments -Wnested-externs: YES 00:02:21.449 Compiler for C supports arguments -Wold-style-definition: YES 00:02:21.449 Compiler for C supports arguments -Wpointer-arith: YES 00:02:21.449 Compiler for C supports arguments -Wsign-compare: YES 00:02:21.449 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:21.449 Compiler for C supports arguments -Wundef: YES 00:02:21.449 Compiler for C supports arguments -Wwrite-strings: YES 00:02:21.449 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:21.449 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:21.449 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:21.449 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:21.449 Compiler for C supports arguments -mavx512f: YES 00:02:21.449 Checking if "AVX512 checking" compiles: YES 00:02:21.449 Fetching value of define "__SSE4_2__" : 1 00:02:21.449 Fetching value of define "__AES__" : 1 00:02:21.449 Fetching value of define "__AVX__" : 1 00:02:21.449 Fetching value of define "__AVX2__" : 1 00:02:21.449 Fetching value of define "__AVX512BW__" : 1 00:02:21.449 Fetching value of define "__AVX512CD__" : 1 00:02:21.449 Fetching value of define "__AVX512DQ__" : 1 00:02:21.449 Fetching value of define "__AVX512F__" : 1 00:02:21.449 Fetching value of define "__AVX512VL__" : 1 00:02:21.449 Fetching value of define "__PCLMUL__" : 1 00:02:21.449 Fetching value of define "__RDRND__" : 1 00:02:21.449 Fetching value of define "__RDSEED__" : 1 00:02:21.449 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:21.449 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:21.449 Message: lib/kvargs: Defining dependency "kvargs" 00:02:21.449 Message: lib/telemetry: Defining dependency "telemetry" 00:02:21.449 Checking for function "getentropy" : YES 00:02:21.449 Message: lib/eal: Defining dependency "eal" 00:02:21.449 Message: lib/ring: Defining dependency "ring" 00:02:21.449 Message: lib/rcu: Defining dependency "rcu" 00:02:21.449 Message: lib/mempool: Defining dependency "mempool" 00:02:21.449 Message: lib/mbuf: Defining dependency "mbuf" 00:02:21.449 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:21.449 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:21.449 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:21.449 Compiler for C supports arguments -mpclmul: YES 00:02:21.449 Compiler for C supports arguments -maes: YES 00:02:21.449 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:21.449 Compiler for C supports arguments -mavx512bw: YES 00:02:21.449 Compiler for C supports arguments -mavx512dq: YES 00:02:21.449 Compiler for C supports arguments -mavx512vl: YES 00:02:21.449 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:21.449 Compiler for C supports arguments -mavx2: YES 00:02:21.449 Compiler for C supports arguments -mavx: YES 00:02:21.449 Message: lib/net: Defining dependency "net" 00:02:21.449 Message: lib/meter: Defining dependency "meter" 00:02:21.449 Message: lib/ethdev: Defining dependency "ethdev" 00:02:21.449 Message: lib/pci: Defining dependency "pci" 00:02:21.449 Message: lib/cmdline: Defining dependency "cmdline" 00:02:21.449 Message: lib/metrics: Defining dependency "metrics" 00:02:21.449 Message: lib/hash: Defining dependency "hash" 00:02:21.449 Message: lib/timer: Defining dependency "timer" 00:02:21.449 Fetching value of define "__AVX2__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:21.449 Message: lib/acl: Defining dependency "acl" 00:02:21.449 Message: lib/bbdev: Defining dependency "bbdev" 00:02:21.449 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:21.449 Run-time dependency libelf found: YES 0.191 00:02:21.449 Message: lib/bpf: Defining dependency "bpf" 00:02:21.449 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:21.449 Message: lib/compressdev: Defining dependency "compressdev" 00:02:21.449 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:21.449 Message: lib/distributor: Defining dependency "distributor" 00:02:21.449 Message: lib/efd: Defining dependency "efd" 00:02:21.449 Message: lib/eventdev: Defining dependency "eventdev" 00:02:21.449 Message: lib/gpudev: Defining dependency "gpudev" 00:02:21.449 Message: lib/gro: Defining dependency "gro" 00:02:21.449 Message: lib/gso: Defining dependency "gso" 00:02:21.449 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:21.449 Message: lib/jobstats: Defining dependency "jobstats" 00:02:21.449 Message: lib/latencystats: Defining dependency "latencystats" 00:02:21.449 Message: lib/lpm: Defining dependency "lpm" 00:02:21.449 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512IFMA__" : (undefined) 00:02:21.449 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:21.449 Message: lib/member: Defining dependency "member" 00:02:21.449 Message: lib/pcapng: Defining dependency "pcapng" 00:02:21.449 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:21.449 Message: lib/power: Defining dependency "power" 00:02:21.449 Message: lib/rawdev: Defining dependency "rawdev" 00:02:21.449 Message: lib/regexdev: Defining dependency "regexdev" 00:02:21.449 Message: lib/dmadev: 
Defining dependency "dmadev" 00:02:21.449 Message: lib/rib: Defining dependency "rib" 00:02:21.449 Message: lib/reorder: Defining dependency "reorder" 00:02:21.449 Message: lib/sched: Defining dependency "sched" 00:02:21.449 Message: lib/security: Defining dependency "security" 00:02:21.449 Message: lib/stack: Defining dependency "stack" 00:02:21.449 Has header "linux/userfaultfd.h" : YES 00:02:21.449 Message: lib/vhost: Defining dependency "vhost" 00:02:21.449 Message: lib/ipsec: Defining dependency "ipsec" 00:02:21.449 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:21.449 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:21.449 Message: lib/fib: Defining dependency "fib" 00:02:21.449 Message: lib/port: Defining dependency "port" 00:02:21.449 Message: lib/pdump: Defining dependency "pdump" 00:02:21.449 Message: lib/table: Defining dependency "table" 00:02:21.449 Message: lib/pipeline: Defining dependency "pipeline" 00:02:21.449 Message: lib/graph: Defining dependency "graph" 00:02:21.449 Message: lib/node: Defining dependency "node" 00:02:21.449 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:21.449 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:21.449 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:21.449 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:21.449 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:21.449 Compiler for C supports arguments -Wno-unused-value: YES 00:02:21.449 Compiler for C supports arguments -Wno-format: YES 00:02:21.449 Compiler for C supports arguments -Wno-format-security: YES 00:02:21.449 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:22.399 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:22.399 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:22.399 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:22.399 Fetching value of define "__AVX2__" : 1 (cached) 00:02:22.399 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:22.399 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:22.399 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:22.399 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:22.399 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:22.399 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:22.399 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:22.399 Configuring doxy-api.conf using configuration 00:02:22.399 Program sphinx-build found: NO 00:02:22.399 Configuring rte_build_config.h using configuration 00:02:22.399 Message: 00:02:22.399 ================= 00:02:22.399 Applications Enabled 00:02:22.399 ================= 00:02:22.399 00:02:22.399 apps: 00:02:22.399 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:02:22.399 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:02:22.399 test-security-perf, 00:02:22.399 00:02:22.399 Message: 00:02:22.399 ================= 00:02:22.399 Libraries Enabled 00:02:22.399 ================= 00:02:22.399 00:02:22.399 libs: 00:02:22.399 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:02:22.399 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:02:22.399 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:02:22.399 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:02:22.399 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:02:22.399 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:02:22.399 table, pipeline, graph, node, 00:02:22.399 00:02:22.399 Message: 00:02:22.399 =============== 00:02:22.399 Drivers Enabled 00:02:22.399 =============== 00:02:22.399 00:02:22.399 common: 00:02:22.399 00:02:22.399 bus: 00:02:22.399 pci, vdev, 00:02:22.399 mempool: 00:02:22.399 ring, 00:02:22.399 dma: 00:02:22.400 00:02:22.400 net: 00:02:22.400 i40e, 00:02:22.400 raw: 00:02:22.400 00:02:22.400 crypto: 00:02:22.400 00:02:22.400 compress: 00:02:22.400 00:02:22.400 regex: 00:02:22.400 00:02:22.400 vdpa: 00:02:22.400 00:02:22.400 event: 00:02:22.400 00:02:22.400 baseband: 00:02:22.400 00:02:22.400 gpu: 00:02:22.400 00:02:22.400 00:02:22.400 Message: 00:02:22.400 ================= 00:02:22.400 Content Skipped 00:02:22.400 ================= 00:02:22.400 00:02:22.400 apps: 00:02:22.400 00:02:22.400 libs: 00:02:22.400 kni: explicitly disabled via build config (deprecated lib) 00:02:22.400 flow_classify: explicitly disabled via build config (deprecated lib) 00:02:22.400 00:02:22.400 drivers: 00:02:22.400 common/cpt: not in enabled drivers build config 00:02:22.400 common/dpaax: not in enabled drivers build config 00:02:22.400 common/iavf: not in enabled drivers build config 00:02:22.400 common/idpf: not in enabled drivers build config 00:02:22.400 common/mvep: not in enabled drivers build config 00:02:22.400 common/octeontx: not in enabled drivers build config 00:02:22.400 bus/auxiliary: not in enabled drivers build config 00:02:22.400 bus/dpaa: not in enabled drivers build config 00:02:22.400 bus/fslmc: not in enabled drivers build config 00:02:22.400 bus/ifpga: not in enabled drivers build config 00:02:22.400 bus/vmbus: not in enabled drivers build config 00:02:22.400 common/cnxk: not in enabled drivers build config 00:02:22.400 common/mlx5: not in enabled drivers build config 00:02:22.400 common/qat: not in enabled drivers build config 00:02:22.400 common/sfc_efx: not in enabled drivers build config 00:02:22.400 mempool/bucket: not in enabled drivers build config 00:02:22.400 mempool/cnxk: not in enabled drivers build config 00:02:22.400 mempool/dpaa: not in enabled drivers build config 00:02:22.400 mempool/dpaa2: not in enabled drivers build config 00:02:22.400 mempool/octeontx: not in enabled drivers build config 00:02:22.400 mempool/stack: not in enabled drivers build config 00:02:22.400 dma/cnxk: not in enabled drivers build config 00:02:22.400 dma/dpaa: not in enabled drivers build config 00:02:22.400 dma/dpaa2: not in enabled drivers build config 00:02:22.400 dma/hisilicon: not in enabled drivers build config 00:02:22.400 dma/idxd: not in enabled drivers build config 00:02:22.400 dma/ioat: not in enabled drivers build config 00:02:22.400 dma/skeleton: not in enabled drivers build config 00:02:22.400 net/af_packet: not in enabled drivers build config 00:02:22.400 net/af_xdp: not in enabled drivers build config 00:02:22.400 net/ark: not in enabled drivers build config 00:02:22.400 net/atlantic: not in enabled drivers build config 00:02:22.400 net/avp: not in enabled drivers build config 00:02:22.400 net/axgbe: not in enabled drivers build config 00:02:22.400 net/bnx2x: not in enabled drivers build config 00:02:22.400 net/bnxt: not in enabled drivers build config 00:02:22.400 net/bonding: not in enabled drivers build config 00:02:22.400 net/cnxk: not in enabled drivers build config 
00:02:22.400 net/cxgbe: not in enabled drivers build config 00:02:22.400 net/dpaa: not in enabled drivers build config 00:02:22.400 net/dpaa2: not in enabled drivers build config 00:02:22.400 net/e1000: not in enabled drivers build config 00:02:22.400 net/ena: not in enabled drivers build config 00:02:22.400 net/enetc: not in enabled drivers build config 00:02:22.400 net/enetfec: not in enabled drivers build config 00:02:22.400 net/enic: not in enabled drivers build config 00:02:22.400 net/failsafe: not in enabled drivers build config 00:02:22.400 net/fm10k: not in enabled drivers build config 00:02:22.400 net/gve: not in enabled drivers build config 00:02:22.400 net/hinic: not in enabled drivers build config 00:02:22.400 net/hns3: not in enabled drivers build config 00:02:22.400 net/iavf: not in enabled drivers build config 00:02:22.400 net/ice: not in enabled drivers build config 00:02:22.400 net/idpf: not in enabled drivers build config 00:02:22.400 net/igc: not in enabled drivers build config 00:02:22.400 net/ionic: not in enabled drivers build config 00:02:22.400 net/ipn3ke: not in enabled drivers build config 00:02:22.400 net/ixgbe: not in enabled drivers build config 00:02:22.400 net/kni: not in enabled drivers build config 00:02:22.400 net/liquidio: not in enabled drivers build config 00:02:22.400 net/mana: not in enabled drivers build config 00:02:22.400 net/memif: not in enabled drivers build config 00:02:22.400 net/mlx4: not in enabled drivers build config 00:02:22.400 net/mlx5: not in enabled drivers build config 00:02:22.400 net/mvneta: not in enabled drivers build config 00:02:22.400 net/mvpp2: not in enabled drivers build config 00:02:22.400 net/netvsc: not in enabled drivers build config 00:02:22.400 net/nfb: not in enabled drivers build config 00:02:22.400 net/nfp: not in enabled drivers build config 00:02:22.400 net/ngbe: not in enabled drivers build config 00:02:22.400 net/null: not in enabled drivers build config 00:02:22.400 net/octeontx: not in enabled drivers build config 00:02:22.400 net/octeon_ep: not in enabled drivers build config 00:02:22.400 net/pcap: not in enabled drivers build config 00:02:22.400 net/pfe: not in enabled drivers build config 00:02:22.400 net/qede: not in enabled drivers build config 00:02:22.400 net/ring: not in enabled drivers build config 00:02:22.400 net/sfc: not in enabled drivers build config 00:02:22.400 net/softnic: not in enabled drivers build config 00:02:22.400 net/tap: not in enabled drivers build config 00:02:22.400 net/thunderx: not in enabled drivers build config 00:02:22.400 net/txgbe: not in enabled drivers build config 00:02:22.400 net/vdev_netvsc: not in enabled drivers build config 00:02:22.400 net/vhost: not in enabled drivers build config 00:02:22.400 net/virtio: not in enabled drivers build config 00:02:22.400 net/vmxnet3: not in enabled drivers build config 00:02:22.400 raw/cnxk_bphy: not in enabled drivers build config 00:02:22.400 raw/cnxk_gpio: not in enabled drivers build config 00:02:22.400 raw/dpaa2_cmdif: not in enabled drivers build config 00:02:22.400 raw/ifpga: not in enabled drivers build config 00:02:22.400 raw/ntb: not in enabled drivers build config 00:02:22.400 raw/skeleton: not in enabled drivers build config 00:02:22.400 crypto/armv8: not in enabled drivers build config 00:02:22.400 crypto/bcmfs: not in enabled drivers build config 00:02:22.400 crypto/caam_jr: not in enabled drivers build config 00:02:22.400 crypto/ccp: not in enabled drivers build config 00:02:22.400 crypto/cnxk: not in enabled drivers 
build config 00:02:22.400 crypto/dpaa_sec: not in enabled drivers build config 00:02:22.400 crypto/dpaa2_sec: not in enabled drivers build config 00:02:22.400 crypto/ipsec_mb: not in enabled drivers build config 00:02:22.400 crypto/mlx5: not in enabled drivers build config 00:02:22.400 crypto/mvsam: not in enabled drivers build config 00:02:22.400 crypto/nitrox: not in enabled drivers build config 00:02:22.400 crypto/null: not in enabled drivers build config 00:02:22.400 crypto/octeontx: not in enabled drivers build config 00:02:22.400 crypto/openssl: not in enabled drivers build config 00:02:22.400 crypto/scheduler: not in enabled drivers build config 00:02:22.400 crypto/uadk: not in enabled drivers build config 00:02:22.400 crypto/virtio: not in enabled drivers build config 00:02:22.400 compress/isal: not in enabled drivers build config 00:02:22.400 compress/mlx5: not in enabled drivers build config 00:02:22.400 compress/octeontx: not in enabled drivers build config 00:02:22.400 compress/zlib: not in enabled drivers build config 00:02:22.400 regex/mlx5: not in enabled drivers build config 00:02:22.400 regex/cn9k: not in enabled drivers build config 00:02:22.400 vdpa/ifc: not in enabled drivers build config 00:02:22.400 vdpa/mlx5: not in enabled drivers build config 00:02:22.400 vdpa/sfc: not in enabled drivers build config 00:02:22.400 event/cnxk: not in enabled drivers build config 00:02:22.400 event/dlb2: not in enabled drivers build config 00:02:22.400 event/dpaa: not in enabled drivers build config 00:02:22.400 event/dpaa2: not in enabled drivers build config 00:02:22.400 event/dsw: not in enabled drivers build config 00:02:22.400 event/opdl: not in enabled drivers build config 00:02:22.400 event/skeleton: not in enabled drivers build config 00:02:22.400 event/sw: not in enabled drivers build config 00:02:22.400 event/octeontx: not in enabled drivers build config 00:02:22.400 baseband/acc: not in enabled drivers build config 00:02:22.400 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:02:22.400 baseband/fpga_lte_fec: not in enabled drivers build config 00:02:22.400 baseband/la12xx: not in enabled drivers build config 00:02:22.400 baseband/null: not in enabled drivers build config 00:02:22.400 baseband/turbo_sw: not in enabled drivers build config 00:02:22.400 gpu/cuda: not in enabled drivers build config 00:02:22.400 00:02:22.400 00:02:22.400 Build targets in project: 311 00:02:22.400 00:02:22.400 DPDK 22.11.4 00:02:22.400 00:02:22.400 User defined options 00:02:22.400 libdir : lib 00:02:22.400 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:22.400 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:02:22.400 c_link_args : 00:02:22.400 enable_docs : false 00:02:22.400 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:22.400 enable_kmods : false 00:02:22.400 machine : native 00:02:22.400 tests : false 00:02:22.400 00:02:22.400 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:22.400 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
00:02:22.400 04:19:00 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:02:22.400 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:22.400 [1/740] Generating lib/rte_kvargs_def with a custom command 00:02:22.400 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:02:22.400 [3/740] Generating lib/rte_telemetry_mingw with a custom command 00:02:22.400 [4/740] Generating lib/rte_telemetry_def with a custom command 00:02:22.400 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:22.400 [6/740] Generating lib/rte_rcu_mingw with a custom command 00:02:22.400 [7/740] Generating lib/rte_rcu_def with a custom command 00:02:22.401 [8/740] Generating lib/rte_eal_mingw with a custom command 00:02:22.401 [9/740] Generating lib/rte_eal_def with a custom command 00:02:22.401 [10/740] Generating lib/rte_ring_def with a custom command 00:02:22.401 [11/740] Generating lib/rte_mbuf_def with a custom command 00:02:22.401 [12/740] Generating lib/rte_net_def with a custom command 00:02:22.401 [13/740] Generating lib/rte_mbuf_mingw with a custom command 00:02:22.401 [14/740] Generating lib/rte_mempool_mingw with a custom command 00:02:22.401 [15/740] Generating lib/rte_mempool_def with a custom command 00:02:22.401 [16/740] Generating lib/rte_meter_mingw with a custom command 00:02:22.401 [17/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:22.401 [18/740] Generating lib/rte_ring_mingw with a custom command 00:02:22.401 [19/740] Generating lib/rte_meter_def with a custom command 00:02:22.401 [20/740] Generating lib/rte_net_mingw with a custom command 00:02:22.401 [21/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:22.668 [22/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:22.668 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:22.668 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:22.668 [25/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:22.668 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:02:22.668 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:22.668 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:22.668 [29/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:22.668 [30/740] Generating lib/rte_ethdev_def with a custom command 00:02:22.668 [31/740] Generating lib/rte_ethdev_mingw with a custom command 00:02:22.668 [32/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:22.668 [33/740] Generating lib/rte_pci_def with a custom command 00:02:22.668 [34/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:22.668 [35/740] Generating lib/rte_pci_mingw with a custom command 00:02:22.668 [36/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:22.668 [37/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:22.668 [38/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:22.668 [39/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:22.668 [40/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:22.668 [41/740] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:22.668 [42/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:22.668 [43/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:22.668 [44/740] Linking static target lib/librte_kvargs.a 00:02:22.668 [45/740] Generating lib/rte_cmdline_def with a custom command 00:02:22.668 [46/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:22.668 [47/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:22.668 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:02:22.668 [49/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:22.668 [50/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:22.668 [51/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:22.668 [52/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:22.668 [53/740] Generating lib/rte_metrics_def with a custom command 00:02:22.668 [54/740] Generating lib/rte_metrics_mingw with a custom command 00:02:22.668 [55/740] Generating lib/rte_hash_def with a custom command 00:02:22.668 [56/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:22.668 [57/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:22.668 [58/740] Generating lib/rte_hash_mingw with a custom command 00:02:22.668 [59/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:22.668 [60/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:22.668 [61/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:22.668 [62/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:22.668 [63/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:22.668 [64/740] Generating lib/rte_timer_def with a custom command 00:02:22.668 [65/740] Generating lib/rte_timer_mingw with a custom command 00:02:22.668 [66/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:22.668 [67/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:22.668 [68/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:22.668 [69/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:22.668 [70/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:22.668 [71/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:22.668 [72/740] Generating lib/rte_bbdev_def with a custom command 00:02:22.668 [73/740] Generating lib/rte_bitratestats_mingw with a custom command 00:02:22.668 [74/740] Generating lib/rte_acl_mingw with a custom command 00:02:22.668 [75/740] Generating lib/rte_bbdev_mingw with a custom command 00:02:22.668 [76/740] Generating lib/rte_acl_def with a custom command 00:02:22.668 [77/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:22.668 [78/740] Linking static target lib/librte_pci.a 00:02:22.668 [79/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:22.668 [80/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:22.668 [81/740] Generating lib/rte_bitratestats_def with a custom command 00:02:22.668 [82/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:22.668 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 
00:02:22.668 [84/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:22.668 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:22.668 [86/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:22.668 [87/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:22.668 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:22.668 [89/740] Generating lib/rte_bpf_mingw with a custom command 00:02:22.668 [90/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:22.668 [91/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:22.668 [92/740] Generating lib/rte_bpf_def with a custom command 00:02:22.668 [93/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:22.668 [94/740] Generating lib/rte_cfgfile_mingw with a custom command 00:02:22.668 [95/740] Generating lib/rte_cfgfile_def with a custom command 00:02:22.668 [96/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:22.668 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:22.668 [98/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:22.668 [99/740] Linking static target lib/librte_meter.a 00:02:22.668 [100/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:22.668 [101/740] Generating lib/rte_compressdev_def with a custom command 00:02:22.668 [102/740] Generating lib/rte_compressdev_mingw with a custom command 00:02:22.668 [103/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:22.668 [104/740] Linking static target lib/librte_ring.a 00:02:22.668 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:22.668 [106/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:22.668 [107/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:02:22.931 [108/740] Generating lib/rte_cryptodev_def with a custom command 00:02:22.931 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:22.931 [110/740] Generating lib/rte_cryptodev_mingw with a custom command 00:02:22.931 [111/740] Generating lib/rte_efd_def with a custom command 00:02:22.931 [112/740] Generating lib/rte_efd_mingw with a custom command 00:02:22.931 [113/740] Generating lib/rte_distributor_def with a custom command 00:02:22.931 [114/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:22.931 [115/740] Generating lib/rte_distributor_mingw with a custom command 00:02:22.931 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:22.931 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:22.931 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:22.931 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:22.931 [120/740] Generating lib/rte_eventdev_mingw with a custom command 00:02:22.931 [121/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:22.931 [122/740] Generating lib/rte_eventdev_def with a custom command 00:02:22.931 [123/740] Generating lib/rte_gpudev_def with a custom command 00:02:22.931 [124/740] Generating lib/rte_gpudev_mingw with a custom command 00:02:22.932 [125/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 
00:02:22.932 [126/740] Generating lib/rte_gro_mingw with a custom command 00:02:22.932 [127/740] Generating lib/rte_gro_def with a custom command 00:02:22.932 [128/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:22.932 [129/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:22.932 [130/740] Generating lib/rte_gso_def with a custom command 00:02:22.932 [131/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:22.932 [132/740] Generating lib/rte_gso_mingw with a custom command 00:02:22.932 [133/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:22.932 [134/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:22.932 [135/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:23.194 [136/740] Generating lib/rte_ip_frag_mingw with a custom command 00:02:23.195 [137/740] Generating lib/rte_ip_frag_def with a custom command 00:02:23.195 [138/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.195 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.195 [140/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:23.195 [141/740] Generating lib/rte_jobstats_def with a custom command 00:02:23.195 [142/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:23.195 [143/740] Generating lib/rte_jobstats_mingw with a custom command 00:02:23.195 [144/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:23.195 [145/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:23.195 [146/740] Linking target lib/librte_kvargs.so.23.0 00:02:23.195 [147/740] Generating lib/rte_latencystats_def with a custom command 00:02:23.195 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:23.195 [149/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:23.195 [150/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:23.195 [151/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.195 [152/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:23.195 [153/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:23.195 [154/740] Generating lib/rte_latencystats_mingw with a custom command 00:02:23.195 [155/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:23.195 [156/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:23.195 [157/740] Generating lib/rte_lpm_def with a custom command 00:02:23.195 [158/740] Generating lib/rte_lpm_mingw with a custom command 00:02:23.195 [159/740] Linking static target lib/librte_cfgfile.a 00:02:23.195 [160/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:23.195 [161/740] Generating lib/rte_member_def with a custom command 00:02:23.195 [162/740] Generating lib/rte_member_mingw with a custom command 00:02:23.195 [163/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:23.195 [164/740] Generating lib/rte_pcapng_def with a custom command 00:02:23.195 [165/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:23.195 [166/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:23.195 [167/740] Compiling C object 
lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:23.195 [168/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.195 [169/740] Generating lib/rte_pcapng_mingw with a custom command 00:02:23.195 [170/740] Linking static target lib/librte_jobstats.a 00:02:23.195 [171/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:23.195 [172/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:23.195 [173/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:23.195 [174/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:23.195 [175/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:23.195 [176/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:23.195 [177/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:23.195 [178/740] Linking static target lib/librte_timer.a 00:02:23.195 [179/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:23.195 [180/740] Generating lib/rte_power_def with a custom command 00:02:23.195 [181/740] Generating lib/rte_power_mingw with a custom command 00:02:23.195 [182/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:23.195 [183/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:23.195 [184/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:23.195 [185/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:23.457 [186/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:23.457 [187/740] Generating lib/rte_rawdev_mingw with a custom command 00:02:23.457 [188/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:23.457 [189/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:23.457 [190/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:23.457 [191/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:23.457 [192/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:23.457 [193/740] Generating lib/rte_rawdev_def with a custom command 00:02:23.457 [194/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:23.457 [195/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:23.457 [196/740] Linking static target lib/librte_cmdline.a 00:02:23.457 [197/740] Generating lib/rte_regexdev_def with a custom command 00:02:23.457 [198/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:23.457 [199/740] Generating lib/rte_regexdev_mingw with a custom command 00:02:23.457 [200/740] Linking static target lib/librte_telemetry.a 00:02:23.457 [201/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:23.457 [202/740] Linking static target lib/librte_metrics.a 00:02:23.457 [203/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:23.457 [204/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:23.457 [205/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:23.457 [206/740] Generating lib/rte_dmadev_def with a custom command 00:02:23.457 [207/740] Generating lib/rte_rib_mingw with a custom command 00:02:23.457 [208/740] Generating lib/rte_reorder_mingw with a custom command 00:02:23.457 [209/740] Generating lib/rte_rib_def with a custom command 00:02:23.457 [210/740] Linking static target lib/librte_net.a 00:02:23.457 [211/740] 
Generating lib/rte_reorder_def with a custom command 00:02:23.457 [212/740] Generating lib/rte_dmadev_mingw with a custom command 00:02:23.457 [213/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:23.457 [214/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:23.457 [215/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:23.457 [216/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:23.457 [217/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:23.457 [218/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:23.457 [219/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:23.457 [220/740] Generating lib/rte_sched_def with a custom command 00:02:23.457 [221/740] Linking static target lib/librte_bitratestats.a 00:02:23.457 [222/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:23.457 [223/740] Generating lib/rte_sched_mingw with a custom command 00:02:23.457 [224/740] Generating lib/rte_security_def with a custom command 00:02:23.457 [225/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:23.458 [226/740] Generating lib/rte_security_mingw with a custom command 00:02:23.458 [227/740] Generating lib/rte_stack_def with a custom command 00:02:23.458 [228/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:23.458 [229/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:23.458 [230/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:23.458 [231/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:23.458 [232/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:23.458 [233/740] Generating lib/rte_stack_mingw with a custom command 00:02:23.458 [234/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:23.458 [235/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:23.458 [236/740] Generating lib/rte_vhost_def with a custom command 00:02:23.458 [237/740] Generating lib/rte_vhost_mingw with a custom command 00:02:23.458 [238/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:23.458 [239/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:23.458 [240/740] Generating lib/rte_ipsec_mingw with a custom command 00:02:23.458 [241/740] Generating lib/rte_ipsec_def with a custom command 00:02:23.458 [242/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:23.458 [243/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:23.458 [244/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:23.458 [245/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:23.458 [246/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:23.458 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:23.458 [248/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:23.458 [249/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:02:23.458 [250/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:23.458 [251/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:23.721 [252/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:23.721 [253/740] Compiling C object 
lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:23.721 [254/740] Generating lib/rte_fib_def with a custom command 00:02:23.721 [255/740] Generating lib/rte_fib_mingw with a custom command 00:02:23.721 [256/740] Linking static target lib/librte_stack.a 00:02:23.721 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:23.721 [258/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:23.721 [259/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:23.721 [260/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:23.721 [261/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:23.721 [262/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:23.721 [263/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:23.721 [264/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:23.721 [265/740] Generating lib/rte_port_def with a custom command 00:02:23.721 [266/740] Linking static target lib/librte_compressdev.a 00:02:23.721 [267/740] Generating lib/rte_port_mingw with a custom command 00:02:23.721 [268/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:23.721 [269/740] Generating lib/rte_pdump_def with a custom command 00:02:23.721 [270/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:23.721 [271/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:23.721 [272/740] Generating lib/rte_pdump_mingw with a custom command 00:02:23.721 [273/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:23.722 [274/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.722 [275/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:23.722 [276/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:23.722 [277/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:23.722 [278/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.722 [279/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:23.722 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:23.722 [281/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:23.722 [282/740] Linking static target lib/librte_rcu.a 00:02:23.722 [283/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:23.722 [284/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.722 [285/740] Linking static target lib/librte_rawdev.a 00:02:23.722 [286/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:23.722 [287/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:23.722 [288/740] Linking static target lib/librte_mempool.a 00:02:23.722 [289/740] Linking static target lib/librte_bbdev.a 00:02:23.722 [290/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.722 [291/740] Generating lib/rte_table_def with a custom command 00:02:23.985 [292/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:23.985 [293/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:23.985 [294/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:23.985 [295/740] Compiling C object 
lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:23.985 [296/740] Generating lib/rte_table_mingw with a custom command 00:02:23.985 [297/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:23.985 [298/740] Linking static target lib/librte_gpudev.a 00:02:23.985 [299/740] Linking static target lib/librte_gro.a 00:02:23.985 [300/740] Linking static target lib/librte_dmadev.a 00:02:23.985 [301/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:23.985 [302/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:23.985 [303/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:23.985 [304/740] Linking static target lib/librte_latencystats.a 00:02:23.985 [305/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:23.985 [306/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.985 [307/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:23.985 [308/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:23.985 [309/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.985 [310/740] Linking static target lib/librte_gso.a 00:02:23.985 [311/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:23.985 [312/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:23.985 [313/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.985 [314/740] Generating lib/rte_pipeline_def with a custom command 00:02:23.985 [315/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:02:23.985 [316/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:23.985 [317/740] Generating lib/rte_pipeline_mingw with a custom command 00:02:23.985 [318/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:23.985 [319/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:23.985 [320/740] Linking static target lib/librte_distributor.a 00:02:23.985 [321/740] Generating lib/rte_graph_def with a custom command 00:02:23.985 [322/740] Linking target lib/librte_telemetry.so.23.0 00:02:23.985 [323/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:23.985 [324/740] Generating lib/rte_graph_mingw with a custom command 00:02:23.985 [325/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:23.985 [326/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.985 [327/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:23.985 [328/740] Linking static target lib/librte_ip_frag.a 00:02:23.985 [329/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:23.985 [330/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:23.985 [331/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:24.247 [332/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:24.247 [333/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:24.247 [334/740] Linking static target lib/librte_regexdev.a 00:02:24.247 [335/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:24.247 [336/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:24.247 
[337/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:24.247 [338/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:24.247 [339/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:24.247 [340/740] Generating lib/rte_node_def with a custom command 00:02:24.247 [341/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:24.247 [342/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:24.247 [343/740] Linking static target lib/librte_eal.a 00:02:24.247 [344/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.247 [345/740] Generating lib/rte_node_mingw with a custom command 00:02:24.247 [346/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:24.247 [347/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.247 [348/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:24.247 [349/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:02:24.247 [350/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:24.247 [351/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.247 [352/740] Generating drivers/rte_bus_pci_def with a custom command 00:02:24.247 [353/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:24.247 [354/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:24.247 [355/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.247 [356/740] Linking static target lib/librte_power.a 00:02:24.247 [357/740] Linking static target lib/librte_reorder.a 00:02:24.247 [358/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:24.247 [359/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:24.247 [360/740] Generating drivers/rte_bus_vdev_def with a custom command 00:02:24.247 [361/740] Generating drivers/rte_mempool_ring_def with a custom command 00:02:24.247 [362/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:02:24.247 [363/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:24.247 [364/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:24.247 [365/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:02:24.247 [366/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:24.247 [367/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:24.247 [368/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:24.247 [369/740] Linking static target lib/librte_pcapng.a 00:02:24.247 [370/740] Linking static target lib/librte_security.a 00:02:24.520 [371/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:24.520 [372/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:24.520 [373/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.520 [374/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:24.520 [375/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:24.520 [376/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:24.520 [377/740] Compiling C object 
lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:24.520 [378/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:24.520 [379/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:24.520 [380/740] Linking static target lib/librte_mbuf.a 00:02:24.520 [381/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:24.520 [382/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:24.520 [383/740] Linking static target lib/librte_bpf.a 00:02:24.520 [384/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.520 [385/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:24.520 [386/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:24.520 [387/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.520 [388/740] Generating drivers/rte_net_i40e_def with a custom command 00:02:24.520 [389/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:24.520 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:24.520 [391/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:24.520 [392/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:24.520 [393/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:02:24.520 [394/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:24.520 [395/740] Linking static target lib/librte_lpm.a 00:02:24.520 [396/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:24.520 [397/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:24.520 [398/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:24.520 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:24.520 [400/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:24.781 [401/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:24.781 [402/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:24.781 [403/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:24.781 [404/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:24.781 [405/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:24.781 [406/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:24.781 [407/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:24.781 [408/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:24.781 [409/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:24.781 [410/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:24.781 [411/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:24.781 [412/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.781 [413/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:24.781 [414/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.781 [415/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.781 [416/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:24.781 [417/740] Compiling C object 
lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:24.781 [418/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:24.781 [419/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:24.781 [420/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:24.781 [421/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:24.781 [422/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:24.781 [423/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:24.781 [424/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:24.781 [425/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:24.781 [426/740] Linking static target lib/librte_graph.a 00:02:24.781 [427/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:24.781 [428/740] Linking static target lib/librte_rib.a 00:02:24.781 [429/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:24.781 [430/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:24.781 [431/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:24.781 [432/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.781 [433/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:24.781 [434/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.781 [435/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:24.781 [436/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:25.044 [437/740] Linking static target lib/librte_efd.a 00:02:25.044 [438/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:25.044 [439/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:25.044 [440/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:25.044 [441/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:25.044 [442/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:25.044 [443/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:25.044 [444/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:25.044 [445/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:25.044 [446/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.044 [447/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:25.044 [448/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:25.044 [449/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.044 [450/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:25.044 [451/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:25.044 [452/740] Linking static target lib/librte_fib.a 00:02:25.044 [453/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:25.313 [454/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:25.313 [455/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.313 [456/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.313 [457/740] Generating drivers/rte_bus_vdev.pmd.c 
with a custom command 00:02:25.313 [458/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.313 [459/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:25.313 [460/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.313 [461/740] Linking static target drivers/librte_bus_vdev.a 00:02:25.313 [462/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:25.313 [463/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:25.313 [464/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.313 [465/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:25.313 [466/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:25.313 [467/740] Linking static target lib/librte_pdump.a 00:02:25.313 [468/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:25.313 [469/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.313 [470/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:25.313 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:25.313 [472/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:25.313 [473/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:25.313 [474/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:25.575 [475/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:25.575 [476/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:25.575 [477/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.575 [478/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:25.575 [479/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:25.575 [480/740] Linking static target drivers/librte_bus_pci.a 00:02:25.575 [481/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:25.575 [482/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:25.575 [483/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:25.575 [484/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.575 [485/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:25.575 [486/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:25.575 [487/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:25.575 [488/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:25.575 [489/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:25.575 [490/740] Linking static target lib/librte_table.a 00:02:25.575 [491/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:25.575 [492/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:25.575 [493/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:25.835 [494/740] Compiling C object 
drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:25.835 [495/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.835 [496/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.835 [497/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:25.835 [498/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:02:25.835 [499/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:02:25.835 [500/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:25.835 [501/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:25.835 [502/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.835 [503/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:25.835 [504/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:25.835 [505/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:25.835 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:25.835 [507/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:25.835 [508/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:25.835 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:25.835 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:25.835 [511/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:25.835 [512/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:25.835 [513/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:25.835 [514/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:25.835 [515/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:25.835 [516/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:25.835 [517/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:25.835 [518/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.835 [519/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:25.835 [520/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:25.835 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:25.835 [522/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:25.835 [523/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:25.835 [524/740] Linking static target lib/librte_cryptodev.a 00:02:25.835 [525/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:26.095 [526/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:26.095 [527/740] Linking static target lib/librte_sched.a 00:02:26.095 [528/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:26.095 [529/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:26.095 [530/740] Compiling C object 
app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:26.095 [531/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:26.095 [532/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:26.095 [533/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:26.095 [534/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.095 [535/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:26.095 [536/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:26.095 [537/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:26.095 [538/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:26.095 [539/740] Linking static target lib/librte_node.a 00:02:26.095 [540/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:26.095 [541/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:26.095 [542/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:26.095 [543/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:26.095 [544/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:26.095 [545/740] Linking static target drivers/librte_mempool_ring.a 00:02:26.095 [546/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:26.095 [547/740] Linking static target lib/librte_ipsec.a 00:02:26.095 [548/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:26.095 [549/740] Linking static target lib/librte_ethdev.a 00:02:26.095 [550/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:26.095 [551/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.095 [552/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:26.095 [553/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:26.355 [554/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:26.355 [555/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:26.355 [556/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:26.355 [557/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:26.355 [558/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:26.355 [559/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:26.355 [560/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:26.355 [561/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:26.355 [562/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:26.355 [563/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:26.355 [564/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:26.355 [565/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:26.355 [566/740] Linking static target lib/librte_member.a 00:02:26.355 [567/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:26.355 [568/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:26.355 
[569/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:26.355 [570/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.355 [571/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:26.355 [572/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:26.355 [573/740] Linking static target lib/librte_eventdev.a 00:02:26.355 [574/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:26.355 [575/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:26.355 [576/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:26.355 [577/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:26.355 [578/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:26.355 [579/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:26.356 [580/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:26.356 [581/740] Linking static target lib/librte_hash.a 00:02:26.356 [582/740] Linking static target lib/librte_port.a 00:02:26.356 [583/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:26.356 [584/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:26.356 [585/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:26.356 [586/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:26.356 [587/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:02:26.356 [588/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:26.616 [589/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:26.616 [590/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:26.616 [591/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:26.616 [592/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:26.616 [593/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:26.616 [594/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.616 [595/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.616 [596/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.616 [597/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:26.617 [598/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:26.617 [599/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:26.877 [600/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:26.877 [601/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:26.877 [602/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.877 [603/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:26.877 [604/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:26.877 [605/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:26.877 [606/740] Linking static target drivers/net/i40e/base/libi40e_base.a 
00:02:26.877 [607/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:26.877 [608/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:27.137 [609/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:02:27.137 [610/740] Linking static target lib/librte_acl.a 00:02:27.137 [611/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:27.137 [612/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:27.397 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.397 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:27.397 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.656 [616/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.656 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:27.656 [618/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:27.915 [619/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:28.175 [620/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:28.435 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:28.694 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:28.694 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:28.953 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:28.953 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:28.953 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:29.214 [627/740] Linking static target drivers/librte_net_i40e.a 00:02:29.474 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:29.734 [629/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.734 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.992 [631/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:29.992 [632/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:30.252 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.535 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.795 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:35.795 [636/740] Linking static target lib/librte_vhost.a 00:02:36.366 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:36.627 [638/740] Linking static target lib/librte_pipeline.a 00:02:36.889 [639/740] Linking target app/dpdk-test-acl 00:02:36.889 [640/740] Linking target app/dpdk-pdump 00:02:36.889 [641/740] Linking target app/dpdk-test-gpudev 00:02:36.889 [642/740] Linking target app/dpdk-dumpcap 00:02:36.889 [643/740] Linking target app/dpdk-test-security-perf 00:02:36.889 [644/740] Linking target app/dpdk-test-regex 00:02:36.889 [645/740] Linking target app/dpdk-proc-info 00:02:36.889 [646/740] Linking target app/dpdk-test-crypto-perf 00:02:36.889 [647/740] Linking target app/dpdk-test-fib 00:02:36.889 [648/740] Linking target app/dpdk-test-sad 
00:02:36.889 [649/740] Linking target app/dpdk-test-cmdline 00:02:36.889 [650/740] Linking target app/dpdk-test-pipeline 00:02:36.889 [651/740] Linking target app/dpdk-test-compress-perf 00:02:36.889 [652/740] Linking target app/dpdk-test-flow-perf 00:02:36.889 [653/740] Linking target app/dpdk-test-eventdev 00:02:36.889 [654/740] Linking target app/dpdk-test-bbdev 00:02:36.889 [655/740] Linking target app/dpdk-testpmd 00:02:38.274 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.534 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.534 [658/740] Linking target lib/librte_eal.so.23.0 00:02:38.794 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:02:38.794 [660/740] Linking target lib/librte_timer.so.23.0 00:02:38.794 [661/740] Linking target lib/librte_ring.so.23.0 00:02:38.794 [662/740] Linking target lib/librte_jobstats.so.23.0 00:02:38.794 [663/740] Linking target lib/librte_pci.so.23.0 00:02:38.794 [664/740] Linking target lib/librte_cfgfile.so.23.0 00:02:38.794 [665/740] Linking target lib/librte_rawdev.so.23.0 00:02:38.794 [666/740] Linking target lib/librte_dmadev.so.23.0 00:02:38.794 [667/740] Linking target lib/librte_meter.so.23.0 00:02:38.794 [668/740] Linking target lib/librte_graph.so.23.0 00:02:38.794 [669/740] Linking target lib/librte_stack.so.23.0 00:02:38.794 [670/740] Linking target drivers/librte_bus_vdev.so.23.0 00:02:38.794 [671/740] Linking target lib/librte_acl.so.23.0 00:02:38.794 [672/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:02:38.794 [673/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:02:38.794 [674/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:02:38.794 [675/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:02:38.794 [676/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:02:38.794 [677/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:02:38.794 [678/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:02:38.794 [679/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:02:39.066 [680/740] Linking target lib/librte_mempool.so.23.0 00:02:39.066 [681/740] Linking target lib/librte_rcu.so.23.0 00:02:39.066 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:02:39.066 [683/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:02:39.066 [684/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:02:39.066 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:02:39.066 [686/740] Linking target lib/librte_rib.so.23.0 00:02:39.066 [687/740] Linking target drivers/librte_mempool_ring.so.23.0 00:02:39.066 [688/740] Linking target lib/librte_mbuf.so.23.0 00:02:39.325 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:02:39.325 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:02:39.325 [691/740] Linking target lib/librte_fib.so.23.0 00:02:39.325 [692/740] Linking target lib/librte_bbdev.so.23.0 00:02:39.325 [693/740] Linking target lib/librte_gpudev.so.23.0 00:02:39.325 [694/740] Linking target lib/librte_net.so.23.0 00:02:39.325 
[695/740] Linking target lib/librte_reorder.so.23.0 00:02:39.325 [696/740] Linking target lib/librte_regexdev.so.23.0 00:02:39.325 [697/740] Linking target lib/librte_compressdev.so.23.0 00:02:39.325 [698/740] Linking target lib/librte_distributor.so.23.0 00:02:39.325 [699/740] Linking target lib/librte_cryptodev.so.23.0 00:02:39.325 [700/740] Linking target lib/librte_sched.so.23.0 00:02:39.585 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:02:39.585 [702/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:02:39.585 [703/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:02:39.585 [704/740] Linking target lib/librte_hash.so.23.0 00:02:39.585 [705/740] Linking target lib/librte_cmdline.so.23.0 00:02:39.585 [706/740] Linking target lib/librte_security.so.23.0 00:02:39.585 [707/740] Linking target lib/librte_ethdev.so.23.0 00:02:39.585 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:02:39.585 [709/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:02:39.585 [710/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:02:39.585 [711/740] Linking target lib/librte_efd.so.23.0 00:02:39.585 [712/740] Linking target lib/librte_lpm.so.23.0 00:02:39.845 [713/740] Linking target lib/librte_member.so.23.0 00:02:39.845 [714/740] Linking target lib/librte_metrics.so.23.0 00:02:39.845 [715/740] Linking target lib/librte_ip_frag.so.23.0 00:02:39.845 [716/740] Linking target lib/librte_bpf.so.23.0 00:02:39.845 [717/740] Linking target lib/librte_pcapng.so.23.0 00:02:39.845 [718/740] Linking target lib/librte_gro.so.23.0 00:02:39.845 [719/740] Linking target lib/librte_gso.so.23.0 00:02:39.845 [720/740] Linking target lib/librte_power.so.23.0 00:02:39.845 [721/740] Linking target lib/librte_eventdev.so.23.0 00:02:39.845 [722/740] Linking target lib/librte_ipsec.so.23.0 00:02:39.845 [723/740] Linking target lib/librte_vhost.so.23.0 00:02:39.845 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:02:39.845 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:02:39.845 [726/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:02:39.845 [727/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:02:39.845 [728/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:02:39.845 [729/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:02:39.845 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:02:39.845 [731/740] Linking target lib/librte_node.so.23.0 00:02:39.845 [732/740] Linking target lib/librte_latencystats.so.23.0 00:02:39.845 [733/740] Linking target lib/librte_bitratestats.so.23.0 00:02:39.845 [734/740] Linking target lib/librte_pdump.so.23.0 00:02:39.845 [735/740] Linking target lib/librte_port.so.23.0 00:02:40.105 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:02:40.105 [737/740] Linking target lib/librte_table.so.23.0 00:02:40.366 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:02:41.749 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:42.009 [740/740] Linking target lib/librte_pipeline.so.23.0 
00:02:42.009 04:19:20 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s 00:02:42.009 04:19:20 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:42.009 04:19:20 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:42.009 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:42.009 [0/1] Installing files. 00:02:42.275 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:42.275 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 
00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:42.276 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:42.276 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:42.277 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.277 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:42.278 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:42.279 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:42.280 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:42.280 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:42.280 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:42.280 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.280 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.281 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_stack.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.546 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:42.547 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:42.547 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:42.547 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:42.547 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:42.547 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-acl to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 
00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.547 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.548 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.549 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.550 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:42.550 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:42.551 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:42.551 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:42.551 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:42.551 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:42.551 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:42.551 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:42.551 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:42.551 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:42.551 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:42.551 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:42.551 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:42.551 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:42.551 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:42.551 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:42.551 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:42.551 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:42.551 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:42.551 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:42.551 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:42.551 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:42.551 Installing symlink pointing to 
librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:42.551 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:42.551 Installing symlink pointing to librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:42.551 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:42.551 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:42.551 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:42.551 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:42.551 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:42.551 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:42.551 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:42.551 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:42.551 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:42.551 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:42.551 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:42.551 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:42.551 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:42.551 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:42.551 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:42.551 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:42.551 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:42.551 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:42.551 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:42.551 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:42.551 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:42.551 Installing 
symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:42.551 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:42.551 Installing symlink pointing to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:42.551 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:42.551 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:42.551 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:42.551 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:42.551 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:42.551 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:42.551 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:42.551 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:42.551 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:42.551 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:42.551 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:42.551 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:42.551 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:42.551 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:42.551 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:42.551 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:42.551 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:42.551 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:42.551 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:42.551 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:42.551 Installing symlink pointing to librte_member.so.23 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:42.551 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:42.551 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:42.551 Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:42.551 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:42.551 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:42.551 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:42.551 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:42.551 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:42.551 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:42.551 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:42.552 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:42.552 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:42.552 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:42.552 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:42.552 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:42.552 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:42.552 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:42.552 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:42.552 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:42.552 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:42.552 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:42.552 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:42.552 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:42.552 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:42.552 './librte_bus_pci.so.23.0' -> 
'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:42.552 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:42.552 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:42.552 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:42.552 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:42.552 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:42.552 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:42.552 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:42.552 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:42.552 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:42.552 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:42.552 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:42.552 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:42.552 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:42.552 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:42.552 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:42.552 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:42.552 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:42.552 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:42.552 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:42.552 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:42.552 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:42.552 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:42.552 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:42.552 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:42.552 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:42.552 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:42.552 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:42.552 Installing symlink pointing to librte_bus_vdev.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:42.552 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:42.552 Installing symlink pointing to librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:02:42.552 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:02:42.552 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:02:42.552 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:02:42.552 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:02:42.552 04:19:21 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:42.552 04:19:21 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:42.552 00:02:42.552 real 0m27.993s 00:02:42.552 user 6m36.703s 00:02:42.552 sys 2m21.277s 00:02:42.552 04:19:21 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:42.552 04:19:21 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:42.552 ************************************ 00:02:42.552 END TEST build_native_dpdk 00:02:42.552 ************************************ 00:02:42.813 04:19:21 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:42.813 04:19:21 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:42.813 04:19:21 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:42.813 04:19:21 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:42.813 04:19:21 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:42.813 04:19:21 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:02:42.813 04:19:21 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:42.813 04:19:21 -- common/autotest_common.sh@10 -- $ set +x 00:02:42.813 ************************************ 00:02:42.813 START TEST autobuild_llvm_precompile 00:02:42.813 ************************************ 00:02:42.813 04:19:21 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile 00:02:42.813 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:43.074 Target: x86_64-redhat-linux-gnu 00:02:43.074 Thread model: posix 00:02:43.074 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:43.074 04:19:21 
autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:43.074 04:19:21 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:43.334 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:43.595 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:43.595 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:43.595 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:44.535 Using 'verbs' RDMA provider 00:03:00.829 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:15.735 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:15.735 Creating mk/config.mk...done. 00:03:15.735 Creating mk/cc.flags.mk...done. 00:03:15.735 Type 'make' to build. 00:03:15.735 00:03:15.735 real 0m32.315s 00:03:15.735 user 0m13.112s 00:03:15.735 sys 0m18.553s 00:03:15.735 04:19:53 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:03:15.735 04:19:53 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:03:15.735 ************************************ 00:03:15.735 END TEST autobuild_llvm_precompile 00:03:15.735 ************************************ 00:03:15.735 04:19:53 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:15.735 04:19:53 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:15.735 04:19:53 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:15.735 04:19:53 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:15.735 04:19:53 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:15.735 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 
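The xtrace lines above show the pattern the precompile step follows: derive the clang major version from `clang --version`, export CC/CXX to the matching clang-17/clang++-17, glob for clang's libclang_rt.fuzzer_no_main archive, and hand that archive plus the pre-built DPDK tree to ./configure. A minimal sketch of the same sequence for a local checkout follows; the relative ../dpdk/build path, the ls-based library lookup, and the trimmed flag list are illustrative assumptions, not values taken from this job (its full flag set is the config_params line above).

  # Sketch only: reproduce the clang + libFuzzer + external-DPDK configure locally (paths are assumptions).
  clang_num=$(clang --version | sed -n 's/.*version \([0-9][0-9]*\).*/\1/p' | head -n1)   # e.g. 17
  export CC=clang-$clang_num CXX=clang++-$clang_num
  # Pick the first libclang_rt.fuzzer_no_main archive shipped with this clang installation.
  fuzzer_lib=$(ls /usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main*.a 2>/dev/null | head -n1)
  ./configure --enable-debug --enable-werror --enable-ubsan --enable-coverage \
      --with-vfio-user \
      --with-dpdk=../dpdk/build \
      --with-fuzzer="$fuzzer_lib"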
00:03:15.735 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:15.735 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:15.735 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:15.996 Using 'verbs' RDMA provider 00:03:29.171 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:41.401 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:41.401 Creating mk/config.mk...done. 00:03:41.401 Creating mk/cc.flags.mk...done. 00:03:41.401 Type 'make' to build. 00:03:41.401 04:20:19 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:41.401 04:20:19 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:03:41.402 04:20:19 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:03:41.402 04:20:19 -- common/autotest_common.sh@10 -- $ set +x 00:03:41.402 ************************************ 00:03:41.402 START TEST make 00:03:41.402 ************************************ 00:03:41.402 04:20:19 make -- common/autotest_common.sh@1129 -- $ make -j112 00:03:41.662 make[1]: Nothing to be done for 'all'. 00:03:43.573 The Meson build system 00:03:43.573 Version: 1.5.0 00:03:43.573 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:43.573 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:43.573 Build type: native build 00:03:43.573 Project name: libvfio-user 00:03:43.573 Project version: 0.0.1 00:03:43.573 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:43.573 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:43.573 Host machine cpu family: x86_64 00:03:43.573 Host machine cpu: x86_64 00:03:43.573 Run-time dependency threads found: YES 00:03:43.573 Library dl found: YES 00:03:43.573 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:43.573 Run-time dependency json-c found: YES 0.17 00:03:43.573 Run-time dependency cmocka found: YES 1.1.7 00:03:43.573 Program pytest-3 found: NO 00:03:43.573 Program flake8 found: NO 00:03:43.573 Program misspell-fixer found: NO 00:03:43.573 Program restructuredtext-lint found: NO 00:03:43.573 Program valgrind found: YES (/usr/bin/valgrind) 00:03:43.573 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:43.573 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:43.573 Compiler for C supports arguments -Wwrite-strings: YES 00:03:43.573 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:43.573 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:43.573 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:43.573 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
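At this point make -j112 has handed off to the bundled libvfio-user sub-build, which meson configures out of spdk/libvfio-user into spdk/build/libvfio-user/build-debug using clang-17. A sketch of driving that sub-build by hand is below; the option values mirror the "User defined options" summary printed just after this (debug buildtype, static default_library, libdir /usr/local/lib), while the cd into a local spdk checkout is an assumption for illustration.

  # Sketch only: configure, build and stage libvfio-user the way this job drives it (checkout path is an assumption).
  cd spdk
  meson setup build/libvfio-user/build-debug libvfio-user \
      --buildtype=debug --default-library=static --libdir=/usr/local/lib
  ninja -C build/libvfio-user/build-debug
  # Stage the install into the SPDK build tree rather than the system root, as the log does via DESTDIR.
  DESTDIR=$PWD/build/libvfio-user meson install --quiet -C build/libvfio-user/build-debug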
00:03:43.573 Build targets in project: 8 00:03:43.573 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:43.573 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:43.573 00:03:43.573 libvfio-user 0.0.1 00:03:43.573 00:03:43.573 User defined options 00:03:43.573 buildtype : debug 00:03:43.573 default_library: static 00:03:43.573 libdir : /usr/local/lib 00:03:43.573 00:03:43.573 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:43.573 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:43.573 [1/36] Compiling C object samples/null.p/null.c.o 00:03:43.573 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:43.573 [3/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:43.573 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:43.573 [5/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:43.574 [6/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:43.574 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:43.574 [8/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:43.574 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:43.574 [10/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:43.574 [11/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:43.574 [12/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:43.574 [13/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:43.574 [14/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:43.574 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:43.574 [16/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:43.574 [17/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:43.833 [18/36] Compiling C object samples/server.p/server.c.o 00:03:43.833 [19/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:43.833 [20/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:43.833 [21/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:43.833 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:43.833 [23/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:43.833 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:43.833 [25/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:43.833 [26/36] Compiling C object samples/client.p/client.c.o 00:03:43.833 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:43.833 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:43.833 [29/36] Linking target samples/client 00:03:43.833 [30/36] Linking static target lib/libvfio-user.a 00:03:43.833 [31/36] Linking target test/unit_tests 00:03:43.833 [32/36] Linking target samples/server 00:03:43.833 [33/36] Linking target samples/null 00:03:43.833 [34/36] Linking target samples/gpio-pci-idio-16 00:03:43.833 [35/36] Linking target samples/lspci 00:03:43.833 [36/36] Linking target samples/shadow_ioeventfd_server 00:03:43.833 INFO: autodetecting backend as ninja 00:03:43.833 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:43.833 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:44.405 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:44.405 ninja: no work to do. 00:03:56.634 CC lib/ut/ut.o 00:03:56.634 CC lib/log/log.o 00:03:56.634 CC lib/log/log_flags.o 00:03:56.634 CC lib/log/log_deprecated.o 00:03:56.634 CC lib/ut_mock/mock.o 00:03:56.894 LIB libspdk_log.a 00:03:56.894 LIB libspdk_ut_mock.a 00:03:56.894 LIB libspdk_ut.a 00:03:57.154 CXX lib/trace_parser/trace.o 00:03:57.154 CC lib/ioat/ioat.o 00:03:57.154 CC lib/dma/dma.o 00:03:57.154 CC lib/util/base64.o 00:03:57.154 CC lib/util/bit_array.o 00:03:57.154 CC lib/util/cpuset.o 00:03:57.154 CC lib/util/crc16.o 00:03:57.154 CC lib/util/crc32.o 00:03:57.154 CC lib/util/crc32c.o 00:03:57.154 CC lib/util/crc32_ieee.o 00:03:57.154 CC lib/util/crc64.o 00:03:57.154 CC lib/util/dif.o 00:03:57.154 CC lib/util/fd.o 00:03:57.154 CC lib/util/fd_group.o 00:03:57.154 CC lib/util/file.o 00:03:57.154 CC lib/util/hexlify.o 00:03:57.154 CC lib/util/iov.o 00:03:57.154 CC lib/util/math.o 00:03:57.154 CC lib/util/strerror_tls.o 00:03:57.154 CC lib/util/net.o 00:03:57.154 CC lib/util/pipe.o 00:03:57.154 CC lib/util/string.o 00:03:57.154 CC lib/util/uuid.o 00:03:57.154 CC lib/util/xor.o 00:03:57.154 CC lib/util/zipf.o 00:03:57.154 CC lib/util/md5.o 00:03:57.154 CC lib/vfio_user/host/vfio_user_pci.o 00:03:57.154 CC lib/vfio_user/host/vfio_user.o 00:03:57.154 LIB libspdk_dma.a 00:03:57.154 LIB libspdk_ioat.a 00:03:57.414 LIB libspdk_vfio_user.a 00:03:57.414 LIB libspdk_util.a 00:03:57.673 LIB libspdk_trace_parser.a 00:03:57.673 CC lib/vmd/vmd.o 00:03:57.673 CC lib/vmd/led.o 00:03:57.673 CC lib/env_dpdk/env.o 00:03:57.673 CC lib/env_dpdk/memory.o 00:03:57.673 CC lib/json/json_parse.o 00:03:57.673 CC lib/env_dpdk/pci.o 00:03:57.673 CC lib/json/json_util.o 00:03:57.673 CC lib/env_dpdk/init.o 00:03:57.673 CC lib/json/json_write.o 00:03:57.673 CC lib/idxd/idxd.o 00:03:57.673 CC lib/env_dpdk/threads.o 00:03:57.673 CC lib/env_dpdk/pci_ioat.o 00:03:57.673 CC lib/idxd/idxd_user.o 00:03:57.673 CC lib/env_dpdk/pci_virtio.o 00:03:57.673 CC lib/idxd/idxd_kernel.o 00:03:57.673 CC lib/rdma_utils/rdma_utils.o 00:03:57.673 CC lib/env_dpdk/pci_event.o 00:03:57.673 CC lib/env_dpdk/pci_vmd.o 00:03:57.673 CC lib/conf/conf.o 00:03:57.673 CC lib/env_dpdk/pci_idxd.o 00:03:57.673 CC lib/env_dpdk/sigbus_handler.o 00:03:57.673 CC lib/env_dpdk/pci_dpdk.o 00:03:57.673 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:57.673 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:57.933 LIB libspdk_conf.a 00:03:57.933 LIB libspdk_json.a 00:03:57.933 LIB libspdk_rdma_utils.a 00:03:58.193 LIB libspdk_idxd.a 00:03:58.193 LIB libspdk_vmd.a 00:03:58.193 CC lib/rdma_provider/common.o 00:03:58.193 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:58.193 CC lib/jsonrpc/jsonrpc_server.o 00:03:58.193 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:58.193 CC lib/jsonrpc/jsonrpc_client.o 00:03:58.193 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:58.454 LIB libspdk_rdma_provider.a 00:03:58.454 LIB libspdk_jsonrpc.a 00:03:58.715 LIB libspdk_env_dpdk.a 00:03:58.715 CC lib/rpc/rpc.o 00:03:58.974 LIB libspdk_rpc.a 00:03:59.235 CC lib/notify/notify.o 00:03:59.235 CC lib/notify/notify_rpc.o 00:03:59.235 CC lib/trace/trace.o 00:03:59.235 CC lib/trace/trace_flags.o 00:03:59.235 CC lib/keyring/keyring.o 00:03:59.235 CC lib/trace/trace_rpc.o 00:03:59.235 CC lib/keyring/keyring_rpc.o 00:03:59.235 LIB libspdk_notify.a 00:03:59.495 LIB libspdk_trace.a 00:03:59.496 LIB 
libspdk_keyring.a 00:03:59.756 CC lib/thread/thread.o 00:03:59.756 CC lib/thread/iobuf.o 00:03:59.756 CC lib/sock/sock.o 00:03:59.756 CC lib/sock/sock_rpc.o 00:04:00.016 LIB libspdk_sock.a 00:04:00.276 CC lib/nvme/nvme_ctrlr_cmd.o 00:04:00.276 CC lib/nvme/nvme_ctrlr.o 00:04:00.276 CC lib/nvme/nvme_fabric.o 00:04:00.276 CC lib/nvme/nvme_ns_cmd.o 00:04:00.276 CC lib/nvme/nvme_ns.o 00:04:00.276 CC lib/nvme/nvme_pcie_common.o 00:04:00.276 CC lib/nvme/nvme_pcie.o 00:04:00.276 CC lib/nvme/nvme_qpair.o 00:04:00.276 CC lib/nvme/nvme.o 00:04:00.276 CC lib/nvme/nvme_quirks.o 00:04:00.276 CC lib/nvme/nvme_transport.o 00:04:00.276 CC lib/nvme/nvme_discovery.o 00:04:00.276 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:04:00.276 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:04:00.276 CC lib/nvme/nvme_tcp.o 00:04:00.276 CC lib/nvme/nvme_opal.o 00:04:00.276 CC lib/nvme/nvme_io_msg.o 00:04:00.276 CC lib/nvme/nvme_poll_group.o 00:04:00.276 CC lib/nvme/nvme_zns.o 00:04:00.276 CC lib/nvme/nvme_stubs.o 00:04:00.276 CC lib/nvme/nvme_auth.o 00:04:00.276 CC lib/nvme/nvme_cuse.o 00:04:00.276 CC lib/nvme/nvme_vfio_user.o 00:04:00.276 CC lib/nvme/nvme_rdma.o 00:04:00.550 LIB libspdk_thread.a 00:04:00.816 CC lib/accel/accel_rpc.o 00:04:00.816 CC lib/accel/accel_sw.o 00:04:00.816 CC lib/accel/accel.o 00:04:00.816 CC lib/virtio/virtio.o 00:04:00.816 CC lib/virtio/virtio_vhost_user.o 00:04:00.816 CC lib/virtio/virtio_vfio_user.o 00:04:00.816 CC lib/virtio/virtio_pci.o 00:04:00.816 CC lib/vfu_tgt/tgt_endpoint.o 00:04:00.816 CC lib/blob/blobstore.o 00:04:00.816 CC lib/blob/request.o 00:04:00.816 CC lib/init/subsystem.o 00:04:00.816 CC lib/vfu_tgt/tgt_rpc.o 00:04:00.816 CC lib/blob/zeroes.o 00:04:00.816 CC lib/init/json_config.o 00:04:00.816 CC lib/init/rpc.o 00:04:00.816 CC lib/blob/blob_bs_dev.o 00:04:00.816 CC lib/init/subsystem_rpc.o 00:04:00.816 CC lib/fsdev/fsdev.o 00:04:00.816 CC lib/fsdev/fsdev_io.o 00:04:00.816 CC lib/fsdev/fsdev_rpc.o 00:04:00.816 LIB libspdk_init.a 00:04:01.077 LIB libspdk_virtio.a 00:04:01.077 LIB libspdk_vfu_tgt.a 00:04:01.077 LIB libspdk_fsdev.a 00:04:01.336 CC lib/event/app.o 00:04:01.336 CC lib/event/scheduler_static.o 00:04:01.336 CC lib/event/reactor.o 00:04:01.336 CC lib/event/log_rpc.o 00:04:01.336 CC lib/event/app_rpc.o 00:04:01.336 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:04:01.597 LIB libspdk_accel.a 00:04:01.597 LIB libspdk_event.a 00:04:01.597 LIB libspdk_nvme.a 00:04:01.857 CC lib/bdev/bdev.o 00:04:01.857 CC lib/bdev/bdev_zone.o 00:04:01.857 CC lib/bdev/bdev_rpc.o 00:04:01.857 CC lib/bdev/part.o 00:04:01.857 CC lib/bdev/scsi_nvme.o 00:04:01.857 LIB libspdk_fuse_dispatcher.a 00:04:02.428 LIB libspdk_blob.a 00:04:02.688 CC lib/blobfs/blobfs.o 00:04:02.688 CC lib/lvol/lvol.o 00:04:02.688 CC lib/blobfs/tree.o 00:04:03.259 LIB libspdk_lvol.a 00:04:03.259 LIB libspdk_blobfs.a 00:04:03.519 LIB libspdk_bdev.a 00:04:03.778 CC lib/ublk/ublk.o 00:04:03.778 CC lib/ublk/ublk_rpc.o 00:04:03.778 CC lib/nbd/nbd.o 00:04:03.778 CC lib/nbd/nbd_rpc.o 00:04:03.778 CC lib/ftl/ftl_core.o 00:04:03.778 CC lib/ftl/ftl_init.o 00:04:03.778 CC lib/scsi/dev.o 00:04:03.778 CC lib/ftl/ftl_layout.o 00:04:03.778 CC lib/scsi/lun.o 00:04:03.778 CC lib/nvmf/ctrlr.o 00:04:03.778 CC lib/ftl/ftl_sb.o 00:04:03.778 CC lib/scsi/scsi_bdev.o 00:04:03.778 CC lib/ftl/ftl_debug.o 00:04:03.778 CC lib/scsi/port.o 00:04:03.778 CC lib/nvmf/ctrlr_discovery.o 00:04:03.778 CC lib/ftl/ftl_io.o 00:04:03.778 CC lib/scsi/scsi.o 00:04:03.778 CC lib/nvmf/ctrlr_bdev.o 00:04:03.778 CC lib/ftl/ftl_l2p.o 00:04:03.778 CC lib/nvmf/subsystem.o 00:04:03.778 CC 
lib/scsi/scsi_pr.o 00:04:03.778 CC lib/ftl/ftl_l2p_flat.o 00:04:03.778 CC lib/scsi/scsi_rpc.o 00:04:03.778 CC lib/nvmf/nvmf.o 00:04:03.778 CC lib/ftl/ftl_nv_cache.o 00:04:03.778 CC lib/scsi/task.o 00:04:03.778 CC lib/nvmf/nvmf_rpc.o 00:04:03.778 CC lib/ftl/ftl_band.o 00:04:03.778 CC lib/nvmf/transport.o 00:04:03.778 CC lib/ftl/ftl_band_ops.o 00:04:03.778 CC lib/nvmf/tcp.o 00:04:03.778 CC lib/ftl/ftl_writer.o 00:04:03.778 CC lib/ftl/ftl_reloc.o 00:04:03.778 CC lib/ftl/ftl_rq.o 00:04:03.778 CC lib/nvmf/stubs.o 00:04:03.778 CC lib/nvmf/mdns_server.o 00:04:03.778 CC lib/nvmf/vfio_user.o 00:04:03.778 CC lib/ftl/ftl_l2p_cache.o 00:04:03.778 CC lib/nvmf/rdma.o 00:04:03.778 CC lib/ftl/ftl_p2l.o 00:04:03.778 CC lib/nvmf/auth.o 00:04:03.778 CC lib/ftl/ftl_p2l_log.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_startup.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:03.778 CC lib/ftl/utils/ftl_md.o 00:04:03.778 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:03.778 CC lib/ftl/utils/ftl_mempool.o 00:04:03.778 CC lib/ftl/utils/ftl_conf.o 00:04:03.778 CC lib/ftl/utils/ftl_property.o 00:04:03.778 CC lib/ftl/utils/ftl_bitmap.o 00:04:03.778 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:03.778 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:03.778 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:03.778 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:03.778 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:03.778 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:03.779 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:03.779 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:03.779 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:04:03.779 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:03.779 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:03.779 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:04:03.779 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:04:03.779 CC lib/ftl/base/ftl_base_bdev.o 00:04:03.779 CC lib/ftl/base/ftl_base_dev.o 00:04:03.779 CC lib/ftl/ftl_trace.o 00:04:04.038 LIB libspdk_nbd.a 00:04:04.298 LIB libspdk_scsi.a 00:04:04.298 LIB libspdk_ublk.a 00:04:04.559 CC lib/iscsi/conn.o 00:04:04.559 CC lib/iscsi/init_grp.o 00:04:04.559 CC lib/iscsi/iscsi.o 00:04:04.559 CC lib/iscsi/param.o 00:04:04.559 CC lib/vhost/vhost_scsi.o 00:04:04.559 CC lib/vhost/vhost.o 00:04:04.559 CC lib/iscsi/portal_grp.o 00:04:04.559 CC lib/vhost/vhost_rpc.o 00:04:04.559 CC lib/iscsi/tgt_node.o 00:04:04.559 CC lib/iscsi/iscsi_subsystem.o 00:04:04.559 CC lib/vhost/vhost_blk.o 00:04:04.559 CC lib/vhost/rte_vhost_user.o 00:04:04.559 CC lib/iscsi/iscsi_rpc.o 00:04:04.559 CC lib/iscsi/task.o 00:04:04.559 LIB libspdk_ftl.a 00:04:05.130 LIB libspdk_nvmf.a 00:04:05.130 LIB libspdk_vhost.a 00:04:05.130 LIB libspdk_iscsi.a 00:04:05.699 CC module/vfu_device/vfu_virtio.o 00:04:05.699 CC module/vfu_device/vfu_virtio_scsi.o 00:04:05.699 CC module/vfu_device/vfu_virtio_blk.o 00:04:05.699 CC module/vfu_device/vfu_virtio_rpc.o 00:04:05.699 CC module/vfu_device/vfu_virtio_fs.o 00:04:05.699 CC module/env_dpdk/env_dpdk_rpc.o 00:04:05.699 CC module/accel/iaa/accel_iaa.o 00:04:05.699 CC module/accel/iaa/accel_iaa_rpc.o 00:04:05.959 LIB libspdk_env_dpdk_rpc.a 00:04:05.959 CC 
module/accel/error/accel_error_rpc.o 00:04:05.959 CC module/accel/error/accel_error.o 00:04:05.959 CC module/keyring/file/keyring.o 00:04:05.959 CC module/keyring/file/keyring_rpc.o 00:04:05.959 CC module/sock/posix/posix.o 00:04:05.959 CC module/accel/ioat/accel_ioat_rpc.o 00:04:05.959 CC module/accel/dsa/accel_dsa.o 00:04:05.959 CC module/accel/ioat/accel_ioat.o 00:04:05.959 CC module/accel/dsa/accel_dsa_rpc.o 00:04:05.959 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:05.959 CC module/blob/bdev/blob_bdev.o 00:04:05.959 CC module/fsdev/aio/fsdev_aio.o 00:04:05.959 CC module/fsdev/aio/linux_aio_mgr.o 00:04:05.959 CC module/fsdev/aio/fsdev_aio_rpc.o 00:04:05.959 CC module/scheduler/gscheduler/gscheduler.o 00:04:05.959 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:05.959 CC module/keyring/linux/keyring.o 00:04:05.959 CC module/keyring/linux/keyring_rpc.o 00:04:05.959 LIB libspdk_keyring_file.a 00:04:05.959 LIB libspdk_accel_iaa.a 00:04:05.959 LIB libspdk_scheduler_dpdk_governor.a 00:04:05.959 LIB libspdk_scheduler_gscheduler.a 00:04:05.959 LIB libspdk_keyring_linux.a 00:04:05.959 LIB libspdk_accel_error.a 00:04:05.959 LIB libspdk_accel_ioat.a 00:04:05.959 LIB libspdk_scheduler_dynamic.a 00:04:05.959 LIB libspdk_blob_bdev.a 00:04:05.959 LIB libspdk_accel_dsa.a 00:04:06.220 LIB libspdk_vfu_device.a 00:04:06.220 LIB libspdk_sock_posix.a 00:04:06.220 LIB libspdk_fsdev_aio.a 00:04:06.479 CC module/bdev/malloc/bdev_malloc.o 00:04:06.479 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:06.479 CC module/bdev/passthru/vbdev_passthru.o 00:04:06.479 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:06.479 CC module/bdev/null/bdev_null.o 00:04:06.479 CC module/bdev/delay/vbdev_delay.o 00:04:06.479 CC module/bdev/raid/bdev_raid.o 00:04:06.479 CC module/bdev/raid/bdev_raid_rpc.o 00:04:06.479 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:06.479 CC module/bdev/raid/bdev_raid_sb.o 00:04:06.479 CC module/bdev/raid/raid0.o 00:04:06.479 CC module/bdev/raid/raid1.o 00:04:06.479 CC module/bdev/null/bdev_null_rpc.o 00:04:06.479 CC module/bdev/raid/concat.o 00:04:06.479 CC module/bdev/iscsi/bdev_iscsi.o 00:04:06.479 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:06.479 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:06.479 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:06.479 CC module/bdev/nvme/bdev_nvme.o 00:04:06.479 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:06.479 CC module/bdev/lvol/vbdev_lvol.o 00:04:06.479 CC module/bdev/nvme/nvme_rpc.o 00:04:06.479 CC module/bdev/nvme/bdev_mdns_client.o 00:04:06.479 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:06.479 CC module/bdev/nvme/vbdev_opal.o 00:04:06.479 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:06.479 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:06.479 CC module/bdev/error/vbdev_error.o 00:04:06.479 CC module/bdev/error/vbdev_error_rpc.o 00:04:06.479 CC module/blobfs/bdev/blobfs_bdev.o 00:04:06.479 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:06.479 CC module/bdev/split/vbdev_split.o 00:04:06.479 CC module/bdev/gpt/gpt.o 00:04:06.479 CC module/bdev/split/vbdev_split_rpc.o 00:04:06.479 CC module/bdev/gpt/vbdev_gpt.o 00:04:06.479 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:06.479 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:06.479 CC module/bdev/ftl/bdev_ftl.o 00:04:06.479 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:06.479 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:06.479 CC module/bdev/aio/bdev_aio.o 00:04:06.479 CC module/bdev/aio/bdev_aio_rpc.o 00:04:06.738 LIB libspdk_blobfs_bdev.a 00:04:06.738 LIB libspdk_bdev_split.a 00:04:06.738 LIB 
libspdk_bdev_error.a 00:04:06.738 LIB libspdk_bdev_null.a 00:04:06.738 LIB libspdk_bdev_passthru.a 00:04:06.738 LIB libspdk_bdev_gpt.a 00:04:06.738 LIB libspdk_bdev_ftl.a 00:04:06.738 LIB libspdk_bdev_zone_block.a 00:04:06.738 LIB libspdk_bdev_malloc.a 00:04:06.738 LIB libspdk_bdev_iscsi.a 00:04:06.738 LIB libspdk_bdev_delay.a 00:04:06.738 LIB libspdk_bdev_aio.a 00:04:06.738 LIB libspdk_bdev_lvol.a 00:04:06.738 LIB libspdk_bdev_virtio.a 00:04:06.997 LIB libspdk_bdev_raid.a 00:04:07.938 LIB libspdk_bdev_nvme.a 00:04:08.509 CC module/event/subsystems/scheduler/scheduler.o 00:04:08.509 CC module/event/subsystems/sock/sock.o 00:04:08.509 CC module/event/subsystems/vmd/vmd.o 00:04:08.509 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:08.509 CC module/event/subsystems/iobuf/iobuf.o 00:04:08.509 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:08.509 CC module/event/subsystems/keyring/keyring.o 00:04:08.509 CC module/event/subsystems/fsdev/fsdev.o 00:04:08.509 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:08.509 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:04:08.509 LIB libspdk_event_keyring.a 00:04:08.509 LIB libspdk_event_vmd.a 00:04:08.509 LIB libspdk_event_scheduler.a 00:04:08.509 LIB libspdk_event_sock.a 00:04:08.509 LIB libspdk_event_fsdev.a 00:04:08.509 LIB libspdk_event_vhost_blk.a 00:04:08.509 LIB libspdk_event_vfu_tgt.a 00:04:08.509 LIB libspdk_event_iobuf.a 00:04:09.080 CC module/event/subsystems/accel/accel.o 00:04:09.080 LIB libspdk_event_accel.a 00:04:09.341 CC module/event/subsystems/bdev/bdev.o 00:04:09.341 LIB libspdk_event_bdev.a 00:04:09.913 CC module/event/subsystems/scsi/scsi.o 00:04:09.913 CC module/event/subsystems/ublk/ublk.o 00:04:09.913 CC module/event/subsystems/nbd/nbd.o 00:04:09.913 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:09.913 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:09.913 LIB libspdk_event_ublk.a 00:04:09.913 LIB libspdk_event_scsi.a 00:04:09.913 LIB libspdk_event_nbd.a 00:04:09.913 LIB libspdk_event_nvmf.a 00:04:10.174 CC module/event/subsystems/iscsi/iscsi.o 00:04:10.174 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:10.434 LIB libspdk_event_vhost_scsi.a 00:04:10.434 LIB libspdk_event_iscsi.a 00:04:10.700 CC app/spdk_nvme_identify/identify.o 00:04:10.700 CXX app/trace/trace.o 00:04:10.700 CC app/trace_record/trace_record.o 00:04:10.700 CC app/spdk_lspci/spdk_lspci.o 00:04:10.700 CC app/spdk_nvme_discover/discovery_aer.o 00:04:10.700 CC app/spdk_top/spdk_top.o 00:04:10.700 TEST_HEADER include/spdk/assert.h 00:04:10.700 TEST_HEADER include/spdk/accel_module.h 00:04:10.700 TEST_HEADER include/spdk/accel.h 00:04:10.700 CC test/rpc_client/rpc_client_test.o 00:04:10.700 TEST_HEADER include/spdk/barrier.h 00:04:10.700 TEST_HEADER include/spdk/base64.h 00:04:10.700 TEST_HEADER include/spdk/bdev.h 00:04:10.700 TEST_HEADER include/spdk/bdev_module.h 00:04:10.700 TEST_HEADER include/spdk/bit_array.h 00:04:10.700 TEST_HEADER include/spdk/bdev_zone.h 00:04:10.700 TEST_HEADER include/spdk/bit_pool.h 00:04:10.700 TEST_HEADER include/spdk/blob_bdev.h 00:04:10.700 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:10.700 TEST_HEADER include/spdk/blob.h 00:04:10.700 TEST_HEADER include/spdk/blobfs.h 00:04:10.700 CC app/spdk_nvme_perf/perf.o 00:04:10.700 TEST_HEADER include/spdk/conf.h 00:04:10.700 TEST_HEADER include/spdk/config.h 00:04:10.700 TEST_HEADER include/spdk/cpuset.h 00:04:10.700 TEST_HEADER include/spdk/crc32.h 00:04:10.700 TEST_HEADER include/spdk/crc16.h 00:04:10.700 TEST_HEADER include/spdk/crc64.h 00:04:10.700 TEST_HEADER 
include/spdk/dif.h 00:04:10.700 TEST_HEADER include/spdk/endian.h 00:04:10.700 TEST_HEADER include/spdk/dma.h 00:04:10.700 TEST_HEADER include/spdk/env.h 00:04:10.700 TEST_HEADER include/spdk/env_dpdk.h 00:04:10.700 TEST_HEADER include/spdk/fd_group.h 00:04:10.700 TEST_HEADER include/spdk/event.h 00:04:10.700 TEST_HEADER include/spdk/file.h 00:04:10.700 TEST_HEADER include/spdk/fd.h 00:04:10.700 TEST_HEADER include/spdk/fsdev_module.h 00:04:10.700 TEST_HEADER include/spdk/fsdev.h 00:04:10.700 TEST_HEADER include/spdk/ftl.h 00:04:10.700 TEST_HEADER include/spdk/fuse_dispatcher.h 00:04:10.700 TEST_HEADER include/spdk/gpt_spec.h 00:04:10.700 TEST_HEADER include/spdk/idxd.h 00:04:10.700 TEST_HEADER include/spdk/hexlify.h 00:04:10.700 TEST_HEADER include/spdk/idxd_spec.h 00:04:10.700 TEST_HEADER include/spdk/histogram_data.h 00:04:10.700 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:10.700 TEST_HEADER include/spdk/init.h 00:04:10.700 TEST_HEADER include/spdk/ioat_spec.h 00:04:10.700 TEST_HEADER include/spdk/ioat.h 00:04:10.700 TEST_HEADER include/spdk/iscsi_spec.h 00:04:10.700 TEST_HEADER include/spdk/json.h 00:04:10.700 TEST_HEADER include/spdk/jsonrpc.h 00:04:10.700 CC app/iscsi_tgt/iscsi_tgt.o 00:04:10.700 TEST_HEADER include/spdk/keyring.h 00:04:10.700 TEST_HEADER include/spdk/likely.h 00:04:10.700 TEST_HEADER include/spdk/keyring_module.h 00:04:10.700 TEST_HEADER include/spdk/log.h 00:04:10.700 TEST_HEADER include/spdk/lvol.h 00:04:10.700 TEST_HEADER include/spdk/md5.h 00:04:10.700 TEST_HEADER include/spdk/memory.h 00:04:10.700 TEST_HEADER include/spdk/mmio.h 00:04:10.700 CC app/spdk_dd/spdk_dd.o 00:04:10.700 TEST_HEADER include/spdk/nbd.h 00:04:10.700 CC app/nvmf_tgt/nvmf_main.o 00:04:10.700 TEST_HEADER include/spdk/nvme.h 00:04:10.700 TEST_HEADER include/spdk/net.h 00:04:10.700 TEST_HEADER include/spdk/notify.h 00:04:10.700 TEST_HEADER include/spdk/nvme_intel.h 00:04:10.700 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:10.700 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:10.700 TEST_HEADER include/spdk/nvme_zns.h 00:04:10.700 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:10.700 TEST_HEADER include/spdk/nvme_spec.h 00:04:10.700 TEST_HEADER include/spdk/nvmf.h 00:04:10.700 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:10.700 TEST_HEADER include/spdk/nvmf_spec.h 00:04:10.700 TEST_HEADER include/spdk/nvmf_transport.h 00:04:10.700 CC app/spdk_tgt/spdk_tgt.o 00:04:10.700 TEST_HEADER include/spdk/opal.h 00:04:10.700 TEST_HEADER include/spdk/pci_ids.h 00:04:10.700 TEST_HEADER include/spdk/opal_spec.h 00:04:10.700 TEST_HEADER include/spdk/queue.h 00:04:10.700 TEST_HEADER include/spdk/pipe.h 00:04:10.700 TEST_HEADER include/spdk/reduce.h 00:04:10.700 TEST_HEADER include/spdk/scsi.h 00:04:10.700 TEST_HEADER include/spdk/rpc.h 00:04:10.700 TEST_HEADER include/spdk/scsi_spec.h 00:04:10.700 TEST_HEADER include/spdk/scheduler.h 00:04:10.700 TEST_HEADER include/spdk/sock.h 00:04:10.700 TEST_HEADER include/spdk/thread.h 00:04:10.700 TEST_HEADER include/spdk/string.h 00:04:10.700 TEST_HEADER include/spdk/trace.h 00:04:10.700 TEST_HEADER include/spdk/tree.h 00:04:10.700 TEST_HEADER include/spdk/stdinc.h 00:04:10.700 TEST_HEADER include/spdk/trace_parser.h 00:04:10.700 TEST_HEADER include/spdk/uuid.h 00:04:10.700 TEST_HEADER include/spdk/ublk.h 00:04:10.700 TEST_HEADER include/spdk/version.h 00:04:10.700 TEST_HEADER include/spdk/util.h 00:04:10.700 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:10.700 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:10.700 TEST_HEADER include/spdk/vhost.h 00:04:10.700 
TEST_HEADER include/spdk/vmd.h 00:04:10.701 TEST_HEADER include/spdk/xor.h 00:04:10.701 CXX test/cpp_headers/accel.o 00:04:10.701 TEST_HEADER include/spdk/zipf.h 00:04:10.701 CXX test/cpp_headers/accel_module.o 00:04:10.701 CXX test/cpp_headers/assert.o 00:04:10.701 CXX test/cpp_headers/base64.o 00:04:10.701 CXX test/cpp_headers/barrier.o 00:04:10.701 CXX test/cpp_headers/bdev_module.o 00:04:10.701 CXX test/cpp_headers/bdev.o 00:04:10.701 CXX test/cpp_headers/bit_array.o 00:04:10.701 CXX test/cpp_headers/bdev_zone.o 00:04:10.701 CXX test/cpp_headers/blob_bdev.o 00:04:10.701 CXX test/cpp_headers/blobfs_bdev.o 00:04:10.701 CXX test/cpp_headers/bit_pool.o 00:04:10.701 CXX test/cpp_headers/blobfs.o 00:04:10.701 CXX test/cpp_headers/blob.o 00:04:10.701 CXX test/cpp_headers/crc16.o 00:04:10.701 CXX test/cpp_headers/config.o 00:04:10.701 CXX test/cpp_headers/conf.o 00:04:10.701 CXX test/cpp_headers/cpuset.o 00:04:10.701 CXX test/cpp_headers/dif.o 00:04:10.701 CXX test/cpp_headers/crc64.o 00:04:10.701 CXX test/cpp_headers/endian.o 00:04:10.701 CXX test/cpp_headers/crc32.o 00:04:10.701 CXX test/cpp_headers/dma.o 00:04:10.701 CC test/thread/poller_perf/poller_perf.o 00:04:10.701 CXX test/cpp_headers/env_dpdk.o 00:04:10.701 CXX test/cpp_headers/fd_group.o 00:04:10.701 CXX test/cpp_headers/env.o 00:04:10.701 CXX test/cpp_headers/fd.o 00:04:10.701 CC test/app/jsoncat/jsoncat.o 00:04:10.701 CXX test/cpp_headers/file.o 00:04:10.701 CXX test/cpp_headers/event.o 00:04:10.701 CXX test/cpp_headers/fsdev.o 00:04:10.701 CXX test/cpp_headers/fsdev_module.o 00:04:10.701 CXX test/cpp_headers/ftl.o 00:04:10.701 CC test/app/stub/stub.o 00:04:10.701 CXX test/cpp_headers/fuse_dispatcher.o 00:04:10.701 CXX test/cpp_headers/hexlify.o 00:04:10.701 CXX test/cpp_headers/histogram_data.o 00:04:10.701 CXX test/cpp_headers/gpt_spec.o 00:04:10.701 CXX test/cpp_headers/idxd.o 00:04:10.701 CC test/app/histogram_perf/histogram_perf.o 00:04:10.701 CC test/thread/lock/spdk_lock.o 00:04:10.701 CC examples/ioat/perf/perf.o 00:04:10.701 CXX test/cpp_headers/idxd_spec.o 00:04:10.701 CXX test/cpp_headers/ioat.o 00:04:10.701 CXX test/cpp_headers/init.o 00:04:10.701 CC test/env/pci/pci_ut.o 00:04:10.701 CXX test/cpp_headers/ioat_spec.o 00:04:10.701 CXX test/cpp_headers/iscsi_spec.o 00:04:10.701 CXX test/cpp_headers/json.o 00:04:10.701 CXX test/cpp_headers/jsonrpc.o 00:04:10.701 CXX test/cpp_headers/keyring.o 00:04:10.701 CXX test/cpp_headers/keyring_module.o 00:04:10.701 CXX test/cpp_headers/log.o 00:04:10.701 CXX test/cpp_headers/likely.o 00:04:10.701 CXX test/cpp_headers/lvol.o 00:04:10.701 CXX test/cpp_headers/md5.o 00:04:10.701 CXX test/cpp_headers/mmio.o 00:04:10.701 LINK spdk_lspci 00:04:10.701 CXX test/cpp_headers/memory.o 00:04:10.701 CXX test/cpp_headers/nbd.o 00:04:10.701 CXX test/cpp_headers/net.o 00:04:10.701 CXX test/cpp_headers/notify.o 00:04:10.701 CXX test/cpp_headers/nvme.o 00:04:10.701 CXX test/cpp_headers/nvme_intel.o 00:04:10.701 CXX test/cpp_headers/nvme_ocssd.o 00:04:10.701 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:10.701 CC examples/ioat/verify/verify.o 00:04:10.701 CC examples/util/zipf/zipf.o 00:04:10.701 CXX test/cpp_headers/nvme_spec.o 00:04:10.701 CXX test/cpp_headers/nvme_zns.o 00:04:10.701 CC test/env/memory/memory_ut.o 00:04:10.701 CXX test/cpp_headers/nvmf_cmd.o 00:04:10.701 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:10.701 CXX test/cpp_headers/nvmf.o 00:04:10.701 CXX test/cpp_headers/nvmf_spec.o 00:04:10.701 CXX test/cpp_headers/nvmf_transport.o 00:04:10.701 CXX test/cpp_headers/opal.o 00:04:10.701 CXX 
test/cpp_headers/opal_spec.o 00:04:10.701 CXX test/cpp_headers/pci_ids.o 00:04:10.701 CXX test/cpp_headers/pipe.o 00:04:10.701 CXX test/cpp_headers/queue.o 00:04:10.701 CXX test/cpp_headers/reduce.o 00:04:10.701 CC test/env/vtophys/vtophys.o 00:04:10.701 CXX test/cpp_headers/scheduler.o 00:04:10.701 CXX test/cpp_headers/rpc.o 00:04:10.701 CXX test/cpp_headers/scsi.o 00:04:10.701 CXX test/cpp_headers/scsi_spec.o 00:04:10.701 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:10.701 CXX test/cpp_headers/sock.o 00:04:10.962 CXX test/cpp_headers/stdinc.o 00:04:10.962 LINK rpc_client_test 00:04:10.962 CC app/fio/nvme/fio_plugin.o 00:04:10.962 CXX test/cpp_headers/string.o 00:04:10.962 CC test/dma/test_dma/test_dma.o 00:04:10.962 CC test/app/bdev_svc/bdev_svc.o 00:04:10.962 CXX test/cpp_headers/thread.o 00:04:10.962 LINK spdk_nvme_discover 00:04:10.962 CC app/fio/bdev/fio_plugin.o 00:04:10.962 LINK spdk_trace_record 00:04:10.962 CC test/env/mem_callbacks/mem_callbacks.o 00:04:10.962 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:10.962 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:10.962 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:10.962 LINK interrupt_tgt 00:04:10.962 CXX test/cpp_headers/trace.o 00:04:10.962 LINK nvmf_tgt 00:04:10.962 LINK poller_perf 00:04:10.962 CXX test/cpp_headers/trace_parser.o 00:04:10.962 CXX test/cpp_headers/tree.o 00:04:10.962 CXX test/cpp_headers/ublk.o 00:04:10.962 LINK jsoncat 00:04:10.962 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:04:10.962 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:10.962 CXX test/cpp_headers/util.o 00:04:10.962 CXX test/cpp_headers/uuid.o 00:04:10.962 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:04:10.962 CXX test/cpp_headers/version.o 00:04:10.962 LINK histogram_perf 00:04:10.962 CXX test/cpp_headers/vfio_user_pci.o 00:04:10.962 CXX test/cpp_headers/vfio_user_spec.o 00:04:10.962 CXX test/cpp_headers/vhost.o 00:04:10.962 CXX test/cpp_headers/vmd.o 00:04:10.962 LINK iscsi_tgt 00:04:10.962 CXX test/cpp_headers/xor.o 00:04:10.962 CXX test/cpp_headers/zipf.o 00:04:10.962 LINK vtophys 00:04:10.962 LINK zipf 00:04:10.962 LINK stub 00:04:10.962 LINK spdk_tgt 00:04:10.962 LINK env_dpdk_post_init 00:04:10.962 LINK ioat_perf 00:04:10.962 LINK verify 00:04:10.962 LINK spdk_trace 00:04:11.223 LINK bdev_svc 00:04:11.223 LINK mem_callbacks 00:04:11.223 LINK nvme_fuzz 00:04:11.223 LINK llvm_vfio_fuzz 00:04:11.223 LINK spdk_dd 00:04:11.223 LINK pci_ut 00:04:11.223 LINK vhost_fuzz 00:04:11.223 LINK spdk_nvme_identify 00:04:11.223 LINK test_dma 00:04:11.483 LINK spdk_nvme_perf 00:04:11.483 LINK memory_ut 00:04:11.483 LINK llvm_nvme_fuzz 00:04:11.483 LINK spdk_bdev 00:04:11.483 LINK spdk_nvme 00:04:11.483 LINK spdk_top 00:04:11.483 CC app/vhost/vhost.o 00:04:11.483 CC examples/idxd/perf/perf.o 00:04:11.743 CC examples/sock/hello_world/hello_sock.o 00:04:11.743 CC examples/vmd/led/led.o 00:04:11.743 CC examples/vmd/lsvmd/lsvmd.o 00:04:11.743 CC examples/thread/thread/thread_ex.o 00:04:11.743 LINK vhost 00:04:11.743 LINK led 00:04:11.743 LINK lsvmd 00:04:11.743 LINK hello_sock 00:04:11.743 LINK idxd_perf 00:04:12.004 LINK thread 00:04:12.004 LINK spdk_lock 00:04:12.004 LINK iscsi_fuzz 00:04:12.575 CC examples/nvme/hello_world/hello_world.o 00:04:12.575 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:12.575 CC examples/nvme/arbitration/arbitration.o 00:04:12.575 CC examples/nvme/hotplug/hotplug.o 00:04:12.575 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:12.575 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:12.575 CC 
examples/nvme/reconnect/reconnect.o 00:04:12.575 CC examples/nvme/abort/abort.o 00:04:12.575 CC test/event/reactor/reactor.o 00:04:12.575 CC test/event/event_perf/event_perf.o 00:04:12.575 CC test/event/reactor_perf/reactor_perf.o 00:04:12.575 CC test/event/app_repeat/app_repeat.o 00:04:12.575 CC test/event/scheduler/scheduler.o 00:04:12.575 LINK pmr_persistence 00:04:12.575 LINK cmb_copy 00:04:12.575 LINK hello_world 00:04:12.834 LINK hotplug 00:04:12.834 LINK reactor 00:04:12.834 LINK reactor_perf 00:04:12.834 LINK event_perf 00:04:12.834 LINK arbitration 00:04:12.834 LINK reconnect 00:04:12.834 LINK abort 00:04:12.834 LINK app_repeat 00:04:12.834 LINK nvme_manage 00:04:12.834 LINK scheduler 00:04:13.094 CC test/nvme/overhead/overhead.o 00:04:13.094 CC test/nvme/simple_copy/simple_copy.o 00:04:13.094 CC test/nvme/startup/startup.o 00:04:13.094 CC test/nvme/reset/reset.o 00:04:13.094 CC test/nvme/cuse/cuse.o 00:04:13.094 CC test/nvme/e2edp/nvme_dp.o 00:04:13.094 CC test/nvme/aer/aer.o 00:04:13.094 CC test/nvme/compliance/nvme_compliance.o 00:04:13.094 CC test/nvme/connect_stress/connect_stress.o 00:04:13.094 CC test/nvme/fdp/fdp.o 00:04:13.094 CC test/nvme/reserve/reserve.o 00:04:13.094 CC test/nvme/boot_partition/boot_partition.o 00:04:13.094 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:13.094 CC test/nvme/sgl/sgl.o 00:04:13.094 CC test/nvme/fused_ordering/fused_ordering.o 00:04:13.094 CC test/nvme/err_injection/err_injection.o 00:04:13.094 CC test/blobfs/mkfs/mkfs.o 00:04:13.094 CC test/accel/dif/dif.o 00:04:13.094 CC test/lvol/esnap/esnap.o 00:04:13.094 LINK startup 00:04:13.094 LINK connect_stress 00:04:13.094 LINK boot_partition 00:04:13.094 LINK err_injection 00:04:13.094 LINK doorbell_aers 00:04:13.094 LINK reserve 00:04:13.094 LINK simple_copy 00:04:13.094 LINK fused_ordering 00:04:13.094 LINK reset 00:04:13.094 LINK nvme_dp 00:04:13.094 LINK overhead 00:04:13.094 LINK aer 00:04:13.360 LINK fdp 00:04:13.360 LINK sgl 00:04:13.360 LINK mkfs 00:04:13.360 LINK nvme_compliance 00:04:13.620 LINK dif 00:04:13.620 CC examples/accel/perf/accel_perf.o 00:04:13.620 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:13.620 CC examples/blob/hello_world/hello_blob.o 00:04:13.620 CC examples/blob/cli/blobcli.o 00:04:13.880 LINK hello_blob 00:04:13.880 LINK hello_fsdev 00:04:13.880 LINK accel_perf 00:04:13.880 LINK cuse 00:04:13.880 LINK blobcli 00:04:14.820 CC examples/bdev/hello_world/hello_bdev.o 00:04:14.820 CC examples/bdev/bdevperf/bdevperf.o 00:04:14.820 LINK hello_bdev 00:04:15.081 CC test/bdev/bdevio/bdevio.o 00:04:15.081 LINK bdevperf 00:04:15.341 LINK bdevio 00:04:16.726 LINK esnap 00:04:16.726 CC examples/nvmf/nvmf/nvmf.o 00:04:16.986 LINK nvmf 00:04:18.371 00:04:18.371 real 0m36.893s 00:04:18.371 user 4m39.677s 00:04:18.371 sys 1m44.875s 00:04:18.371 04:20:56 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:04:18.371 04:20:56 make -- common/autotest_common.sh@10 -- $ set +x 00:04:18.371 ************************************ 00:04:18.371 END TEST make 00:04:18.371 ************************************ 00:04:18.371 04:20:56 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:18.371 04:20:56 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:18.371 04:20:56 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:18.371 04:20:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.371 04:20:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:18.371 04:20:56 -- 
pm/common@44 -- $ pid=6227 00:04:18.371 04:20:56 -- pm/common@50 -- $ kill -TERM 6227 00:04:18.371 04:20:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.371 04:20:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:18.371 04:20:56 -- pm/common@44 -- $ pid=6229 00:04:18.371 04:20:56 -- pm/common@50 -- $ kill -TERM 6229 00:04:18.371 04:20:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.371 04:20:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:18.371 04:20:56 -- pm/common@44 -- $ pid=6232 00:04:18.371 04:20:56 -- pm/common@50 -- $ kill -TERM 6232 00:04:18.371 04:20:56 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.371 04:20:56 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:18.371 04:20:56 -- pm/common@44 -- $ pid=6255 00:04:18.371 04:20:56 -- pm/common@50 -- $ sudo -E kill -TERM 6255 00:04:18.371 04:20:56 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:04:18.371 04:20:56 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:04:18.371 04:20:56 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:18.371 04:20:56 -- common/autotest_common.sh@1693 -- # lcov --version 00:04:18.371 04:20:56 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:18.371 04:20:57 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:18.371 04:20:57 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:18.371 04:20:57 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:18.371 04:20:57 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:18.371 04:20:57 -- scripts/common.sh@336 -- # IFS=.-: 00:04:18.371 04:20:57 -- scripts/common.sh@336 -- # read -ra ver1 00:04:18.371 04:20:57 -- scripts/common.sh@337 -- # IFS=.-: 00:04:18.371 04:20:57 -- scripts/common.sh@337 -- # read -ra ver2 00:04:18.371 04:20:57 -- scripts/common.sh@338 -- # local 'op=<' 00:04:18.371 04:20:57 -- scripts/common.sh@340 -- # ver1_l=2 00:04:18.371 04:20:57 -- scripts/common.sh@341 -- # ver2_l=1 00:04:18.371 04:20:57 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:18.371 04:20:57 -- scripts/common.sh@344 -- # case "$op" in 00:04:18.371 04:20:57 -- scripts/common.sh@345 -- # : 1 00:04:18.371 04:20:57 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:18.371 04:20:57 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:18.371 04:20:57 -- scripts/common.sh@365 -- # decimal 1 00:04:18.371 04:20:57 -- scripts/common.sh@353 -- # local d=1 00:04:18.371 04:20:57 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:18.371 04:20:57 -- scripts/common.sh@355 -- # echo 1 00:04:18.371 04:20:57 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:18.371 04:20:57 -- scripts/common.sh@366 -- # decimal 2 00:04:18.371 04:20:57 -- scripts/common.sh@353 -- # local d=2 00:04:18.371 04:20:57 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:18.371 04:20:57 -- scripts/common.sh@355 -- # echo 2 00:04:18.371 04:20:57 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:18.371 04:20:57 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:18.371 04:20:57 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:18.371 04:20:57 -- scripts/common.sh@368 -- # return 0 00:04:18.371 04:20:57 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:18.371 04:20:57 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:18.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.371 --rc genhtml_branch_coverage=1 00:04:18.371 --rc genhtml_function_coverage=1 00:04:18.371 --rc genhtml_legend=1 00:04:18.371 --rc geninfo_all_blocks=1 00:04:18.371 --rc geninfo_unexecuted_blocks=1 00:04:18.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.371 ' 00:04:18.371 04:20:57 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:18.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.371 --rc genhtml_branch_coverage=1 00:04:18.371 --rc genhtml_function_coverage=1 00:04:18.371 --rc genhtml_legend=1 00:04:18.371 --rc geninfo_all_blocks=1 00:04:18.371 --rc geninfo_unexecuted_blocks=1 00:04:18.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.371 ' 00:04:18.371 04:20:57 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:18.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.371 --rc genhtml_branch_coverage=1 00:04:18.371 --rc genhtml_function_coverage=1 00:04:18.371 --rc genhtml_legend=1 00:04:18.371 --rc geninfo_all_blocks=1 00:04:18.371 --rc geninfo_unexecuted_blocks=1 00:04:18.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.371 ' 00:04:18.371 04:20:57 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:18.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.371 --rc genhtml_branch_coverage=1 00:04:18.371 --rc genhtml_function_coverage=1 00:04:18.371 --rc genhtml_legend=1 00:04:18.371 --rc geninfo_all_blocks=1 00:04:18.371 --rc geninfo_unexecuted_blocks=1 00:04:18.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.371 ' 00:04:18.371 04:20:57 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:18.371 04:20:57 -- nvmf/common.sh@7 -- # uname -s 00:04:18.371 04:20:57 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:18.371 04:20:57 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:18.371 04:20:57 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:18.371 04:20:57 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:18.371 04:20:57 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:18.371 04:20:57 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:18.371 04:20:57 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:18.371 04:20:57 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:18.371 04:20:57 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:18.371 04:20:57 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:18.371 04:20:57 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:18.371 04:20:57 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:18.371 04:20:57 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:18.371 04:20:57 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:18.371 04:20:57 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:18.371 04:20:57 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:18.371 04:20:57 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:18.371 04:20:57 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:18.371 04:20:57 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:18.371 04:20:57 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:18.371 04:20:57 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:18.371 04:20:57 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.371 04:20:57 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.371 04:20:57 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.371 04:20:57 -- paths/export.sh@5 -- # export PATH 00:04:18.371 04:20:57 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.371 04:20:57 -- nvmf/common.sh@51 -- # : 0 00:04:18.371 04:20:57 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:18.371 04:20:57 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:18.371 04:20:57 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:18.371 04:20:57 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:18.371 04:20:57 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:18.371 04:20:57 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:18.372 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:18.372 04:20:57 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:18.372 04:20:57 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:18.372 04:20:57 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:18.372 04:20:57 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:18.372 04:20:57 -- spdk/autotest.sh@32 -- # uname -s 00:04:18.372 
04:20:57 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:18.372 04:20:57 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:18.372 04:20:57 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:18.372 04:20:57 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:18.372 04:20:57 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:18.372 04:20:57 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:18.372 04:20:57 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:18.372 04:20:57 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:18.372 04:20:57 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:18.372 04:20:57 -- spdk/autotest.sh@48 -- # udevadm_pid=84917 00:04:18.372 04:20:57 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:18.372 04:20:57 -- pm/common@17 -- # local monitor 00:04:18.372 04:20:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.372 04:20:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.372 04:20:57 -- pm/common@21 -- # date +%s 00:04:18.372 04:20:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.372 04:20:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.372 04:20:57 -- pm/common@21 -- # date +%s 00:04:18.372 04:20:57 -- pm/common@21 -- # date +%s 00:04:18.372 04:20:57 -- pm/common@25 -- # sleep 1 00:04:18.372 04:20:57 -- pm/common@21 -- # date +%s 00:04:18.372 04:20:57 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731813657 00:04:18.372 04:20:57 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731813657 00:04:18.372 04:20:57 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731813657 00:04:18.372 04:20:57 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731813657 00:04:18.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731813657_collect-vmstat.pm.log 00:04:18.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731813657_collect-cpu-load.pm.log 00:04:18.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731813657_collect-cpu-temp.pm.log 00:04:18.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731813657_collect-bmc-pm.bmc.pm.log 00:04:19.575 04:20:58 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:19.575 04:20:58 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:19.575 04:20:58 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:19.575 04:20:58 -- common/autotest_common.sh@10 -- # set +x 
00:04:19.575 04:20:58 -- spdk/autotest.sh@59 -- # create_test_list 00:04:19.575 04:20:58 -- common/autotest_common.sh@752 -- # xtrace_disable 00:04:19.575 04:20:58 -- common/autotest_common.sh@10 -- # set +x 00:04:19.575 04:20:58 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:19.576 04:20:58 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:19.576 04:20:58 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:19.576 04:20:58 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:19.576 04:20:58 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:19.576 04:20:58 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:19.576 04:20:58 -- common/autotest_common.sh@1457 -- # uname 00:04:19.576 04:20:58 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:19.576 04:20:58 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:19.576 04:20:58 -- common/autotest_common.sh@1477 -- # uname 00:04:19.576 04:20:58 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:19.576 04:20:58 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:19.576 04:20:58 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:19.576 lcov: LCOV version 1.15 00:04:19.576 04:20:58 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:27.713 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:30.256 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:35.545 04:21:14 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:35.545 04:21:14 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:35.545 04:21:14 -- common/autotest_common.sh@10 -- # set +x 00:04:35.545 04:21:14 -- spdk/autotest.sh@78 -- # rm -f 00:04:35.545 04:21:14 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:38.853 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:38.853 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:39.114 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:39.114 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:39.114 04:21:17 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:39.114 04:21:17 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:39.114 04:21:17 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:39.114 04:21:17 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:39.114 04:21:17 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:39.114 04:21:17 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:39.114 04:21:17 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:39.114 04:21:17 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:39.114 04:21:17 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:39.114 04:21:17 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:39.114 04:21:17 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:39.114 04:21:17 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:39.114 04:21:17 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:39.114 04:21:17 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:39.114 04:21:17 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:39.374 No valid GPT data, bailing 00:04:39.374 04:21:17 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:39.374 04:21:17 -- scripts/common.sh@394 -- # pt= 00:04:39.374 04:21:17 -- scripts/common.sh@395 -- # return 1 00:04:39.374 04:21:17 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:39.374 1+0 records in 00:04:39.374 1+0 records out 00:04:39.374 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00150848 s, 695 MB/s 00:04:39.374 04:21:17 -- spdk/autotest.sh@105 -- # sync 00:04:39.374 04:21:18 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:39.374 04:21:18 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:39.374 04:21:18 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:47.513 04:21:25 -- spdk/autotest.sh@111 -- # uname -s 00:04:47.513 04:21:25 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:47.513 04:21:25 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:47.513 04:21:25 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:47.513 04:21:25 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:47.513 04:21:25 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:47.513 04:21:25 -- common/autotest_common.sh@10 -- # set +x 00:04:47.513 ************************************ 00:04:47.513 START TEST setup.sh 00:04:47.513 ************************************ 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:47.513 * Looking for test storage... 
00:04:47.513 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:47.513 04:21:25 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:47.513 04:21:25 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:47.513 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 
00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:47.514 04:21:25 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:47.514 04:21:25 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:47.514 04:21:25 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:47.514 04:21:25 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:47.514 04:21:25 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:47.514 ************************************ 00:04:47.514 START TEST acl 00:04:47.514 ************************************ 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:47.514 * Looking for test storage... 
00:04:47.514 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:47.514 04:21:25 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc 
genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:47.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.514 --rc genhtml_branch_coverage=1 00:04:47.514 --rc genhtml_function_coverage=1 00:04:47.514 --rc genhtml_legend=1 00:04:47.514 --rc geninfo_all_blocks=1 00:04:47.514 --rc geninfo_unexecuted_blocks=1 00:04:47.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.514 ' 00:04:47.514 04:21:25 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:47.514 04:21:25 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:47.514 04:21:25 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:47.514 04:21:25 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:47.514 04:21:25 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:47.514 04:21:25 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:47.514 04:21:25 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:47.514 04:21:25 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:47.514 04:21:25 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:51.720 04:21:29 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:51.720 04:21:29 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:51.720 04:21:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:51.720 04:21:29 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:51.720 04:21:29 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.720 04:21:29 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:55.017 Hugepages 00:04:55.017 node hugesize free / total 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.017 04:21:33 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.017 00:04:55.017 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.017 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:55.018 04:21:33 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:55.018 04:21:33 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:55.018 04:21:33 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:55.018 04:21:33 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:55.018 ************************************ 00:04:55.018 START TEST denied 00:04:55.018 ************************************ 00:04:55.018 04:21:33 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:55.018 04:21:33 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:55.018 04:21:33 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:55.018 04:21:33 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:55.018 04:21:33 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.018 04:21:33 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:59.223 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:59.223 04:21:37 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:03.427 00:05:03.427 real 0m8.508s 00:05:03.427 user 0m2.736s 00:05:03.427 sys 0m5.108s 00:05:03.427 04:21:41 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:03.427 04:21:41 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:03.427 ************************************ 00:05:03.427 END TEST denied 00:05:03.427 ************************************ 00:05:03.427 04:21:42 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:03.427 04:21:42 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:03.427 04:21:42 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:03.427 04:21:42 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:03.427 ************************************ 00:05:03.427 START TEST allowed 00:05:03.427 ************************************ 00:05:03.427 04:21:42 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:05:03.427 04:21:42 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:05:03.427 04:21:42 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:03.427 04:21:42 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:05:03.427 04:21:42 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.427 04:21:42 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:08.723 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:08.723 04:21:47 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:08.723 04:21:47 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:08.723 04:21:47 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:08.723 04:21:47 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:08.723 04:21:47 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:12.930 00:05:12.930 real 0m9.027s 00:05:12.930 user 0m2.589s 00:05:12.930 sys 0m5.137s 00:05:12.930 04:21:51 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:12.930 04:21:51 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:12.930 ************************************ 00:05:12.930 END TEST allowed 00:05:12.930 ************************************ 00:05:12.930 00:05:12.930 real 0m25.404s 00:05:12.930 user 0m8.124s 00:05:12.930 sys 0m15.617s 00:05:12.930 04:21:51 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:12.930 04:21:51 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:12.930 ************************************ 00:05:12.930 END TEST acl 00:05:12.930 ************************************ 00:05:12.930 04:21:51 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:12.930 04:21:51 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:12.930 04:21:51 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:12.930 04:21:51 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:12.930 ************************************ 00:05:12.930 START TEST hugepages 00:05:12.930 ************************************ 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:12.930 * Looking for test storage... 00:05:12.930 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:12.930 04:21:51 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:12.930 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:12.930 --rc genhtml_branch_coverage=1 00:05:12.930 --rc genhtml_function_coverage=1 00:05:12.930 --rc genhtml_legend=1 00:05:12.930 --rc geninfo_all_blocks=1 00:05:12.930 --rc geninfo_unexecuted_blocks=1 00:05:12.930 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:12.930 ' 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:12.930 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:12.930 --rc genhtml_branch_coverage=1 00:05:12.930 --rc genhtml_function_coverage=1 00:05:12.930 --rc genhtml_legend=1 00:05:12.930 --rc geninfo_all_blocks=1 00:05:12.930 --rc geninfo_unexecuted_blocks=1 00:05:12.930 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:12.930 ' 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:12.930 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:12.930 --rc genhtml_branch_coverage=1 00:05:12.930 --rc genhtml_function_coverage=1 00:05:12.930 --rc genhtml_legend=1 00:05:12.930 --rc geninfo_all_blocks=1 00:05:12.930 --rc geninfo_unexecuted_blocks=1 00:05:12.930 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:12.930 ' 00:05:12.930 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:12.930 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:12.930 --rc genhtml_branch_coverage=1 00:05:12.930 --rc genhtml_function_coverage=1 00:05:12.930 --rc genhtml_legend=1 00:05:12.930 --rc geninfo_all_blocks=1 00:05:12.930 --rc geninfo_unexecuted_blocks=1 00:05:12.930 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:12.930 ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:12.930 04:21:51 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 39431932 kB' 'MemAvailable: 43115660 kB' 'Buffers: 8940 kB' 'Cached: 12476376 kB' 'SwapCached: 0 kB' 'Active: 9497624 kB' 'Inactive: 3663056 kB' 'Active(anon): 9093568 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 678872 kB' 'Mapped: 142268 kB' 'Shmem: 8418204 kB' 'KReclaimable: 226968 kB' 'Slab: 839324 kB' 'SReclaimable: 226968 kB' 'SUnreclaim: 612356 kB' 'KernelStack: 21744 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433344 kB' 'Committed_AS: 10849112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214112 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.930 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 
04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 
04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 
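[editorial note] The xtrace above is setup/common.sh's get_meminfo walking /proc/meminfo key by key and returning the Hugepagesize value (2048 kB on this node). A minimal standalone sketch of that lookup, for readers following the trace — the helper name below is illustrative and not part of the SPDK scripts:

    # read a single field from /proc/meminfo, e.g. Hugepagesize -> 2048
    meminfo_value() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { printf '%s\n' "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }

The real helper, as the trace shows, additionally strips per-node "Node N " prefixes so the same parser works against /sys/devices/system/node/node*/meminfo when a node argument is given.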
00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:12.931 04:21:51 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:05:12.931 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:12.931 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:12.931 04:21:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:12.931 ************************************ 00:05:12.931 START TEST single_node_setup 00:05:12.931 ************************************ 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:12.931 04:21:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:16.232 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:16.232 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:16.232 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:16.232 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:16.232 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:16.232 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:16.232 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.493 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.413 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:18.413 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41616500 kB' 'MemAvailable: 45299588 kB' 'Buffers: 8940 kB' 'Cached: 12476580 kB' 'SwapCached: 0 kB' 'Active: 9504412 kB' 'Inactive: 3663056 kB' 'Active(anon): 9100356 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685220 kB' 'Mapped: 142240 kB' 'Shmem: 8418408 kB' 'KReclaimable: 225688 kB' 'Slab: 837928 kB' 'SReclaimable: 225688 kB' 'SUnreclaim: 612240 kB' 'KernelStack: 21792 kB' 'PageTables: 7724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10850040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.414 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.415 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.415 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41617344 kB' 'MemAvailable: 45300424 kB' 'Buffers: 8940 kB' 'Cached: 12476584 kB' 'SwapCached: 0 kB' 'Active: 9504212 kB' 'Inactive: 3663056 kB' 'Active(anon): 9100156 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685012 kB' 'Mapped: 142240 kB' 'Shmem: 8418412 kB' 'KReclaimable: 225672 kB' 'Slab: 837912 kB' 'SReclaimable: 225672 kB' 'SUnreclaim: 612240 kB' 'KernelStack: 21696 kB' 'PageTables: 7664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10850056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.416 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.417 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41618244 kB' 'MemAvailable: 45301324 kB' 'Buffers: 8940 kB' 'Cached: 12476584 kB' 'SwapCached: 0 kB' 'Active: 9504968 kB' 'Inactive: 3663056 kB' 'Active(anon): 9100912 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685764 kB' 'Mapped: 142240 kB' 'Shmem: 8418412 kB' 'KReclaimable: 225672 kB' 'Slab: 837972 kB' 'SReclaimable: 225672 kB' 'SUnreclaim: 612300 kB' 'KernelStack: 21840 kB' 'PageTables: 7540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10848580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.418 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.419 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:18.420 nr_hugepages=1024 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:18.420 resv_hugepages=0 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:18.420 surplus_hugepages=0 00:05:18.420 04:21:56 
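[Editor's note] The field-by-field scan traced above is the meminfo lookup helper from setup/common.sh walking /proc/meminfo until it reaches HugePages_Rsvd, echoing 0 and returning, which is why the test then records resv=0 alongside nr_hugepages=1024. A minimal sketch of that lookup pattern follows; the function name get_meminfo_sketch is illustrative, not the actual helper, though the IFS=': ' / read -r var val _ / continue structure mirrors what the trace shows.

# Sketch of the meminfo lookup pattern traced above (illustrative names).
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node queries read that node's own meminfo file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    while IFS=': ' read -r var val _; do
        # Skip every field until the requested one comes up.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")   # per-node files prefix each line with "Node N "
    return 1
}

# Example matching this run: get_meminfo_sketch HugePages_Rsvd  -> 0, hence resv=0 above.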
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:18.420 anon_hugepages=0 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41617644 kB' 'MemAvailable: 45300724 kB' 'Buffers: 8940 kB' 'Cached: 12476624 kB' 'SwapCached: 0 kB' 'Active: 9504368 kB' 'Inactive: 3663056 kB' 'Active(anon): 9100312 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685164 kB' 'Mapped: 142240 kB' 'Shmem: 8418452 kB' 'KReclaimable: 225672 kB' 'Slab: 837944 kB' 'SReclaimable: 225672 kB' 'SUnreclaim: 612272 kB' 'KernelStack: 21824 kB' 'PageTables: 7872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10850104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.420 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.421 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26112700 kB' 'MemUsed: 6472668 kB' 'SwapCached: 0 kB' 'Active: 3425160 kB' 'Inactive: 176840 kB' 'Active(anon): 3245196 kB' 'Inactive(anon): 0 kB' 'Active(file): 179964 kB' 'Inactive(file): 176840 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3013060 kB' 'Mapped: 55476 kB' 'AnonPages: 592040 kB' 'Shmem: 2656256 kB' 'KernelStack: 12536 kB' 'PageTables: 5268 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79576 kB' 'Slab: 312704 kB' 'SReclaimable: 79576 kB' 'SUnreclaim: 233128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.422 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:56 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:18.423 04:21:57 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:18.424 04:21:57 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:18.424 node0=1024 expecting 1024 00:05:18.424 04:21:57 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:18.424 00:05:18.424 real 0m5.460s 00:05:18.424 user 0m1.452s 00:05:18.424 sys 0m2.514s 00:05:18.424 04:21:57 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:18.424 04:21:57 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x 00:05:18.424 ************************************ 00:05:18.424 END TEST single_node_setup 00:05:18.424 ************************************ 00:05:18.424 04:21:57 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc 00:05:18.424 04:21:57 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:18.424 04:21:57 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:18.424 04:21:57 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:18.424 ************************************ 00:05:18.424 START TEST even_2G_alloc 00:05:18.424 ************************************ 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local 
_nr_hugepages=1024 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:18.424 04:21:57 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:21.724 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:21.724 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:21.992 04:22:00 
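[Editor's note] even_2G_alloc asks for 2097152 kB of hugepages, i.e. 1024 pages of 2048 kB, and splits them evenly across the two NUMA nodes (512 each) before re-running setup.sh; verify_nr_hugepages, which starts at the end of the trace above, then boils down to a few arithmetic checks against /proc/meminfo and the per-node meminfo files. A rough outline of those checks, using the lookup sketched earlier (variable names are illustrative, not a quote of setup/hugepages.sh):

# Rough outline of the verification that follows in the trace (illustrative names).
nr_hugepages=1024    # NRHUGE=1024: 2097152 kB / 2048 kB per page
total=$(get_meminfo_sketch HugePages_Total)
surp=$(get_meminfo_sketch HugePages_Surp)
resv=$(get_meminfo_sketch HugePages_Rsvd)

# Global check: what the kernel reports must add up to the request.
(( total == nr_hugepages + surp + resv )) || echo "unexpected hugepage count"

# Per-node check: with the even split this test requests, each of the two
# nodes should report 512 pages in its own meminfo file.
for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*node}
    echo "node$node=$(get_meminfo_sketch HugePages_Total "$node")"
done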
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41581088 kB' 'MemAvailable: 45264192 kB' 'Buffers: 8940 kB' 'Cached: 12476828 kB' 'SwapCached: 0 kB' 'Active: 9508728 kB' 'Inactive: 3663056 kB' 'Active(anon): 9104672 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 689004 kB' 'Mapped: 141368 kB' 'Shmem: 8418656 kB' 'KReclaimable: 225720 kB' 'Slab: 837464 kB' 'SReclaimable: 225720 kB' 'SUnreclaim: 611744 kB' 'KernelStack: 21696 kB' 'PageTables: 7600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10842396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 
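[Editor's note] The 'always [madvise] never' string tested at the top of this block looks like the content of /sys/kernel/mm/transparent_hugepage/enabled, so the anonymous-hugepage figure is only queried when THP is not set to [never]; the scan below then walks the meminfo snapshot field by field until AnonHugePages is reached. An illustrative equivalent of that gate is sketched here; the exact wording in setup/hugepages.sh may differ.

# Illustrative version of the anon-hugepages gate seen in the trace.
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    # THP is not fully disabled, so anonymous hugepage usage is worth recording.
    anon=$(get_meminfo_sketch AnonHugePages)
else
    anon=0
fi
echo "anon_hugepages=$anon"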
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.992 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 
04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.993 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41581568 kB' 'MemAvailable: 45264672 kB' 'Buffers: 8940 kB' 'Cached: 12476832 kB' 'SwapCached: 0 kB' 'Active: 9508640 kB' 
'Inactive: 3663056 kB' 'Active(anon): 9104584 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 688948 kB' 'Mapped: 141336 kB' 'Shmem: 8418660 kB' 'KReclaimable: 225720 kB' 'Slab: 837492 kB' 'SReclaimable: 225720 kB' 'SUnreclaim: 611772 kB' 'KernelStack: 21632 kB' 'PageTables: 7400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10840996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.994 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.995 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41580812 kB' 'MemAvailable: 45263912 kB' 'Buffers: 8940 kB' 'Cached: 12476832 kB' 'SwapCached: 0 kB' 'Active: 9508092 kB' 'Inactive: 3663056 kB' 'Active(anon): 9104036 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 688828 kB' 'Mapped: 141260 kB' 'Shmem: 8418660 kB' 'KReclaimable: 225712 kB' 'Slab: 837476 kB' 'SReclaimable: 225712 kB' 'SUnreclaim: 611764 kB' 'KernelStack: 21680 kB' 'PageTables: 7528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10841016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 
'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.996 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.997 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:21.998 nr_hugepages=1024 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:21.998 resv_hugepages=0 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:21.998 surplus_hugepages=0 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:21.998 anon_hugepages=0 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41580748 kB' 'MemAvailable: 45263848 kB' 'Buffers: 8940 kB' 'Cached: 12476872 kB' 'SwapCached: 0 kB' 'Active: 9507652 kB' 'Inactive: 3663056 kB' 'Active(anon): 9103596 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 688300 kB' 'Mapped: 141260 kB' 'Shmem: 8418700 kB' 'KReclaimable: 225712 kB' 'Slab: 837476 kB' 'SReclaimable: 225712 kB' 'SUnreclaim: 611764 kB' 'KernelStack: 21648 kB' 'PageTables: 7408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10840672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 
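The backslash-heavy comparisons traced above are only xtrace quoting: each [[ var == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] is a literal match while get_meminfo scans /proc/meminfo for the requested field, here HugePages_Rsvd, which comes back as 0; the test then echoes the nr/resv/surplus/anon summary and re-checks the pool. A minimal stand-alone sketch of that lookup and the same accounting check, assuming the 1024-page request from this run (the helper name is illustrative, not the repo's setup/common.sh):

    get_field() {                          # get_field FIELD [NODE] -- illustrative helper
        local field=$1 node=${2:-} file=/proc/meminfo var val
        [[ -n $node ]] && file=/sys/devices/system/node/node$node/meminfo
        # Per-node files prefix every line with "Node N "; drop it so one scan serves both.
        while IFS=': ' read -r var val _; do
            [[ $var == "$field" ]] && { echo "$val"; return; }
        done < <(sed 's/^Node [0-9]* //' "$file")
    }

    requested=1024                          # nr_hugepages asked for by even_2G_alloc here
    total=$(get_field HugePages_Total)      # 1024 in the snapshot above
    resv=$(get_field HugePages_Rsvd)        # 0
    surp=$(get_field HugePages_Surp)        # 0
    echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp"
    (( total == requested + surp + resv )) && echo 'hugepage pool matches the request'
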
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.998 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:21.999 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:22.000 04:22:00 
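With HugePages_Total reading back as 1024, get_nodes walks /sys/devices/system/node/node+([0-9]) and records 512 pages per node with no_nodes=2, i.e. the pool is expected to split evenly across both NUMA nodes. A short sketch of that expectation, assuming the 1024-page figure from this run (variable names are illustrative):

    shopt -s extglob nullglob
    requested=1024                                    # pages requested by even_2G_alloc here
    nodes=(/sys/devices/system/node/node+([0-9]))     # same glob the trace iterates
    if (( ${#nodes[@]} > 0 )); then
        per_node=$(( requested / ${#nodes[@]} ))
        for n in "${nodes[@]}"; do
            echo "${n##*/}=${per_node} expecting ${per_node}"   # 512 per node on this 2-node box
        done
    fi
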
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 27161836 kB' 'MemUsed: 5423532 kB' 'SwapCached: 0 kB' 'Active: 3427552 kB' 'Inactive: 176840 kB' 'Active(anon): 3247588 kB' 'Inactive(anon): 0 kB' 'Active(file): 179964 kB' 'Inactive(file): 176840 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3013180 kB' 'Mapped: 55220 kB' 'AnonPages: 594476 kB' 'Shmem: 2656376 kB' 'KernelStack: 12360 kB' 'PageTables: 5024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79624 kB' 'Slab: 312192 kB' 'SReclaimable: 79624 kB' 'SUnreclaim: 232568 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 
04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.000 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
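For the node-scoped lookup above, mem_f switches to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix; the mem=("${mem[@]#Node +([0-9]) }") step strips that prefix so the same field scan can be reused. A small stand-alone illustration of that expansion (node0 and the three-line slice are only for demonstration):

    shopt -s extglob                                  # required for the +([0-9]) pattern
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")                  # "Node 0 MemTotal: ..." -> "MemTotal: ..."
    printf '%s\n' "${mem[@]:0:3}"                     # e.g. the MemTotal/MemFree/MemUsed lines
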
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698416 kB' 'MemFree: 14419120 kB' 'MemUsed: 13279296 kB' 'SwapCached: 0 kB' 'Active: 6079916 kB' 'Inactive: 3486216 kB' 'Active(anon): 5855824 kB' 'Inactive(anon): 0 kB' 'Active(file): 224092 kB' 'Inactive(file): 3486216 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9472676 kB' 'Mapped: 86040 kB' 'AnonPages: 93588 kB' 'Shmem: 5762368 kB' 'KernelStack: 9272 kB' 'PageTables: 2304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146088 kB' 'Slab: 525284 kB' 'SReclaimable: 146088 kB' 'SUnreclaim: 379196 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.001 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 
04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:22.002 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:22.003 node0=512 expecting 512
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:22.003 node1=512 expecting 512
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:22.003
00:05:22.003 real 0m3.704s
00:05:22.003 user 0m1.419s
00:05:22.003 sys 0m2.356s
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:22.003 04:22:00 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:22.003 ************************************
00:05:22.003 END TEST even_2G_alloc
00:05:22.003 ************************************
00:05:22.264 04:22:00 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:22.264 04:22:00 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:22.264 04:22:00 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:22.264 04:22:00 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:22.264 ************************************
00:05:22.264 START TEST odd_alloc
00:05:22.264 ************************************
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
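The even_2G_alloc verification that just finished walks "${!nodes_test[@]}", records every distinct per-node count as a key of the sorted_t/sorted_s associative arrays, prints "nodeN=512 expecting 512" per node, and closes with a single [[ 512 == 512 ]] comparison. Below is a minimal sketch of that keys-as-a-set idiom; the array contents come from the trace, everything else is simplified (the real setup/hugepages.sh also tracks system-allocated pages in sorted_s).

    # Keys-as-a-set sketch of the check traced above; values taken from the log,
    # the surrounding bookkeeping of setup/hugepages.sh is omitted.
    declare -A sorted_t=()
    declare -a nodes_test=([0]=512 [1]=512)          # per-node hugepage counts seen above
    for node in "${!nodes_test[@]}"; do
        sorted_t[${nodes_test[node]}]=1              # distinct counts become array keys
        echo "node${node}=${nodes_test[node]} expecting ${nodes_test[node]}"
    done
    [[ ${!sorted_t[*]} == 512 ]] && echo OK          # a single key means every node matched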
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:22.264 04:22:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:25.561 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:25.561 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:25.562 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:25.562 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:25.829 04:22:04
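get_test_nr_hugepages above was asked for 2098176 kB, i.e. 2049 MiB, which does not divide evenly by the 2048 kB hugepage size, so the test settles on the odd count nr_hugepages=1025 and exports HUGEMEM=2049 before re-running scripts/setup.sh. The per-node loop then hands out 512 and 513 pages across the two NUMA nodes. The sketch below is a rough reconstruction of that split, inferred from the trace rather than copied from setup/hugepages.sh.

    # Inferred sketch of get_test_nr_hugepages_per_node: give each node the floor
    # share of what is still unassigned; the last node processed absorbs the remainder.
    split_hugepages() {
        local left=$1 nodes=$2 i share
        for ((i = nodes; i > 0; i--)); do
            share=$(( left / i ))
            echo "node$((i - 1)) gets ${share} pages"
            left=$(( left - share ))
        done
    }
    split_hugepages 1025 2    # node1 gets 512 pages, node0 gets 513 pages, as in the trace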
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41589636 kB' 'MemAvailable: 45272688 kB' 'Buffers: 8940 kB' 'Cached: 12477116 kB' 'SwapCached: 0 kB' 'Active: 9505464 kB' 'Inactive: 3663056 kB' 'Active(anon): 9101408 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685808 kB' 'Mapped: 141408 kB' 'Shmem: 8418944 kB' 'KReclaimable: 225616 kB' 'Slab: 837996 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612380 kB' 'KernelStack: 21696 kB' 'PageTables: 7612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10842120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.829 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41590388 kB' 'MemAvailable: 45273440 kB' 'Buffers: 8940 kB' 'Cached: 12477120 kB' 'SwapCached: 0 kB' 'Active: 9506872 kB' 'Inactive: 3663056 kB' 'Active(anon): 9102816 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 687220 kB' 'Mapped: 141780 kB' 'Shmem: 8418948 kB' 'KReclaimable: 225616 kB' 'Slab: 837972 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612356 kB' 'KernelStack: 21648 kB' 'PageTables: 7440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10845076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- 
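The AnonHugePages lookup that just returned 0 (and the HugePages_Surp scan now under way) is the get_meminfo helper snapshotting /proc/meminfo with mapfile, stripping any leading "Node N" prefix, and walking the fields with IFS=': ' read -r var val _ until the requested key matches; every non-matching field shows up as one of the continue iterations above. Below is a condensed sketch of that scan, simplified from what the trace shows rather than the verbatim setup/common.sh (which, as the "local node=" lines indicate, also accepts an optional node argument and reads that node's own meminfo).

    shopt -s extglob                                   # needed for the +([0-9]) pattern below
    # Condensed sketch of the meminfo scan traced above; simplified, not the literal setup/common.sh.
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        local -a mem
        local line var val _
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")               # per-node files prefix every line with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }   # e.g. AnonHugePages -> 0
        done
        return 1
    }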
setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.830 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 
04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.831 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.832 04:22:04 
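verify_nr_hugepages drives three of these scans in a row: anon (hugepages.sh@96, already 0 above), surp (@98, also 0), and resv (@99, the HugePages_Rsvd pass the trace is in the middle of). In terms of the sketch above, the sequence amounts to:

    # The three counters gathered by verify_nr_hugepages in this run (per @96-@99 above);
    # anon and surp came back 0, resv is still being read as the trace continues.
    anon=$(get_meminfo AnonHugePages)
    surp=$(get_meminfo HugePages_Surp)
    resv=$(get_meminfo HugePages_Rsvd)
    echo "anon=${anon} surp=${surp} resv=${resv}"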
setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41586608 kB' 'MemAvailable: 45269660 kB' 'Buffers: 8940 kB' 'Cached: 12477120 kB' 'SwapCached: 0 kB' 'Active: 9510076 kB' 'Inactive: 3663056 kB' 'Active(anon): 9106020 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690928 kB' 'Mapped: 141780 kB' 'Shmem: 8418948 kB' 'KReclaimable: 225616 kB' 'Slab: 837972 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612356 kB' 'KernelStack: 21680 kB' 'PageTables: 7536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10848276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214244 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.832 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.833 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 
04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:05:25.834 nr_hugepages=1025 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:25.834 resv_hugepages=0 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:25.834 surplus_hugepages=0 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:25.834 anon_hugepages=0 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41586600 kB' 'MemAvailable: 45269652 kB' 'Buffers: 8940 kB' 'Cached: 12477156 kB' 'SwapCached: 0 kB' 'Active: 9505168 kB' 'Inactive: 3663056 kB' 'Active(anon): 9101112 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 685444 kB' 'Mapped: 141556 kB' 'Shmem: 8418984 kB' 'KReclaimable: 225616 kB' 'Slab: 837972 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612356 kB' 'KernelStack: 21680 kB' 'PageTables: 7560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480896 kB' 'Committed_AS: 10842620 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.834 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.835 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:25.836 04:22:04 
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 27143060 kB' 'MemUsed: 5442308 kB' 'SwapCached: 0 kB' 'Active: 3427768 kB' 'Inactive: 176840 kB' 'Active(anon): 3247804 kB' 'Inactive(anon): 0 kB' 'Active(file): 179964 kB' 'Inactive(file): 176840 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3013340 kB' 'Mapped: 55724 kB' 'AnonPages: 594476 kB' 'Shmem: 2656536 kB' 'KernelStack: 12376 kB' 'PageTables: 5040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79528 kB' 'Slab: 312524 kB' 'SReclaimable: 79528 kB' 'SUnreclaim: 232996 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.836 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': '
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
[per-field scan of the node0 meminfo: FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total and HugePages_Free are each skipped with 'continue'; HugePages_Surp matches]
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.837 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.838 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698416 kB' 'MemFree: 14443432 kB' 'MemUsed: 13254984 kB' 'SwapCached: 0 kB' 'Active: 6079192 kB' 'Inactive: 3486216 kB' 'Active(anon): 5855100 kB' 'Inactive(anon): 0 kB' 'Active(file): 224092 kB' 'Inactive(file): 3486216 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9472800 kB' 'Mapped: 86208 kB' 'AnonPages: 92696 kB' 'Shmem: 5762492 kB' 'KernelStack: 9256 kB' 'PageTables: 2348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146088 kB' 'Slab: 525448 kB' 'SReclaimable: 146088 kB' 'SUnreclaim: 379360 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[per-field scan of the node1 meminfo above: every field is skipped with 'continue'; HugePages_Surp matches]
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513'
00:05:25.839 node0=513 expecting 513
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:25.839 node1=512 expecting 512
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:05:25.839
00:05:25.839 real 0m3.686s
00:05:25.839 user 0m1.351s
00:05:25.839 sys 0m2.393s
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:25.839 04:22:04 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:25.839 ************************************
00:05:25.839 END TEST odd_alloc
00:05:25.839 ************************************
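The odd_alloc result above comes from reading the hugepage counters out of each node's meminfo and comparing the adjusted per-node totals against the odd split (513 pages on node0, 512 on node1). A minimal standalone sketch of that read-and-compare pattern follows; it is not the suite's own get_meminfo verbatim, and the function name and the hard-coded 513/512 expectations are illustrative, taken from the log above.

#!/usr/bin/env bash
# Sketch only: read one field for a given NUMA node the way the xtrace above does --
# prefer /sys/devices/system/node/nodeN/meminfo, fall back to /proc/meminfo.
get_node_meminfo() {
    local field=$1 node=$2 line var val _
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while read -r line; do
        line=${line#"Node $node "}           # per-node files prefix every line with "Node <n> "
        IFS=': ' read -r var val _ <<<"$line"
        if [[ $var == "$field" ]]; then
            echo "$val"
            return 0
        fi
    done <"$mem_f"
    return 1
}

# Illustration of the check the test just logged: node 0 carries the odd extra page.
for node in 0 1; do
    want=$((node == 0 ? 513 : 512))
    echo "node$node=$(get_node_meminfo HugePages_Total "$node") expecting $want"
done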
00:05:25.839 04:22:04 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:05:25.839 04:22:04 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:25.839 04:22:04 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:25.839 04:22:04 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:25.839 ************************************
00:05:25.839 START TEST custom_alloc
00:05:25.839 ************************************
00:05:25.839 04:22:04 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc
00:05:25.839 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:05:25.839 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:05:25.839 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=()
00:05:25.839 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp
00:05:25.839 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 ))
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:26.100 04:22:04 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:29.401 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:29.401 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
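setup.sh has just been driven by the HUGENODE string traced above, which custom_alloc assembles from the nodes_hp array. A minimal sketch of that assembly follows; the 512/1024 values and the resulting 1536 total come from the log, while join_by_comma and the variable layout are illustrative rather than the suite's code verbatim.

#!/usr/bin/env bash
# Sketch only: join the two per-node counts traced above into the HUGENODE
# string and the total page count.
declare -a nodes_hp=([0]=512 [1]=1024)
declare -a HUGENODE=()
nr_hugepages=0

for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    ((nr_hugepages += nodes_hp[node]))
done

join_by_comma() { local IFS=,; echo "$*"; }
echo "HUGENODE=$(join_by_comma "${HUGENODE[@]}")"   # nodes_hp[0]=512,nodes_hp[1]=1024
echo "nr_hugepages=$nr_hugepages"                   # 1536, the figure verified next in the log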
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.401 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40574404 kB' 'MemAvailable: 44257456 kB' 'Buffers: 8940 kB' 'Cached: 12477364 kB' 'SwapCached: 0 kB' 'Active: 9511104 kB' 'Inactive: 3663056 kB' 'Active(anon): 9107048 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 689952 kB' 'Mapped: 141412 kB' 'Shmem: 8419192 kB' 'KReclaimable: 225616 kB' 'Slab: 838040 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612424 kB' 'KernelStack: 21808 kB' 'PageTables: 7740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10845568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB'
[per-field scan of the meminfo above: every field is skipped with 'continue'; AnonHugePages matches]
00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
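Both meminfo dumps in this verification pass are walked with the same parsing idiom: mapfile the file into an array, strip any "Node <n> " prefix, then scan it with IFS=': ' until the requested field matches. Reassembled as a standalone snippet below; the field name and file path are only examples, and outside the suite's shell options the "+([0-9])" pattern additionally needs extglob.

#!/usr/bin/env bash
# Sketch only: the mapfile + prefix-strip + IFS=': ' scan seen in the xtrace.
shopt -s extglob                       # "+([0-9])" is an extended glob: one or more digits
mem_f=/proc/meminfo                    # a per-node file would be /sys/devices/system/node/node0/meminfo
mapfile -t mem <"$mem_f"
mem=("${mem[@]#Node +([0-9]) }")       # no-op for /proc/meminfo; strips "Node <n> " from per-node files
printf '%s\n' "${mem[@]}" | while IFS=': ' read -r var val _; do
    [[ $var == HugePages_Surp ]] && echo "HugePages_Surp: $val"
done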
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.671 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.672 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
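The repeating [[ <field> == HugePages_Surp ]] / continue / IFS=': ' / read -r var val _ entries above and below are the xtrace of get_meminfo in setup/common.sh walking every /proc/meminfo field until it reaches HugePages_Surp. A minimal reconstruction of that scan loop, inferred from the trace rather than taken from the repository source (the real helper also handles per-node files, as seen later in this log), looks roughly like:

    # Sketch reconstructed from the xtrace; not the verbatim setup/common.sh source.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo mem line
        # Per-node queries read the node-local file instead (used for node0 further down).
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node [0-9] }")              # node files prefix each line with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line" # one IFS/read pair per iteration in the trace
            [[ $var == "$get" ]] || continue       # every 'continue' above is one skipped field
            echo "$val"                            # 0 for HugePages_Surp in this run
            return 0
        done
        return 1
    }

The echo 0 / return 0 entries that close each of these scans are that value being handed back to setup/hugepages.sh.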
00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.673 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40583108 kB' 'MemAvailable: 44266160 kB' 'Buffers: 8940 kB' 'Cached: 12477384 kB' 'SwapCached: 0 kB' 'Active: 9510752 kB' 'Inactive: 3663056 kB' 'Active(anon): 9106696 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690680 kB' 'Mapped: 141288 kB' 'Shmem: 8419212 kB' 'KReclaimable: 225616 kB' 'Slab: 838016 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612400 kB' 'KernelStack: 21776 kB' 'PageTables: 7876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10844108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:29.673 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.674 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 
04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:05:29.675 nr_hugepages=1536 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:29.675 resv_hugepages=0 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:29.675 surplus_hugepages=0 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:29.675 anon_hugepages=0 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages )) 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.675 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 40582348 kB' 'MemAvailable: 44265400 kB' 'Buffers: 8940 kB' 'Cached: 12477408 kB' 'SwapCached: 0 kB' 'Active: 9510472 kB' 'Inactive: 3663056 kB' 'Active(anon): 9106416 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690380 kB' 'Mapped: 141288 kB' 'Shmem: 8419236 kB' 'KReclaimable: 225616 kB' 'Slab: 838016 kB' 'SReclaimable: 225616 kB' 'SUnreclaim: 612400 kB' 'KernelStack: 21600 kB' 'PageTables: 7116 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957632 kB' 'Committed_AS: 10843000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.675 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
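For orientation while the HugePages_Total scan continues below: the two scans above already gave hugepages.sh surp=0 and resv=0, and the script has echoed nr_hugepages=1536, so what it is verifying here is plain accounting against /proc/meminfo. A stand-alone equivalent of that check (awk stands in for get_meminfo purely for brevity; the script itself re-runs the scan, as the surrounding trace shows):

    # Values as reported in this run.
    nr_hugepages=1536 surp=0 resv=0
    total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting consistent: total=$total"
    else
        echo "unexpected HugePages_Total: $total" >&2
    fi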
00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.676 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
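The scan wraps up in the entries that follow: get_meminfo returns 1536, hugepages.sh confirms it still equals nr_hugepages + surp + resv, and get_nodes then records the current split for this custom_alloc case, 512 pages on node0 and 1024 on node1, before re-reading the node-local meminfo files. A compact way to state the per-node expectation being exercised (reading /sys directly; the script goes through get_meminfo, and the exact bookkeeping in hugepages.sh is more involved than this):

    # Per-node split recorded in this run: 512 + 1024 == 1536.
    expected=( [0]=512 [1]=1024 )
    for node in 0 1; do
        got=$(awk '$3 == "HugePages_Total:" {print $4}' \
              "/sys/devices/system/node/node$node/meminfo")
        (( got == expected[node] )) \
            || echo "node$node: expected ${expected[node]} hugepages, got $got" >&2
    done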
00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 
27163184 kB' 'MemUsed: 5422184 kB' 'SwapCached: 0 kB' 'Active: 3431924 kB' 'Inactive: 176840 kB' 'Active(anon): 3251960 kB' 'Inactive(anon): 0 kB' 'Active(file): 179964 kB' 'Inactive(file): 176840 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3013528 kB' 'Mapped: 55220 kB' 'AnonPages: 598408 kB' 'Shmem: 2656724 kB' 'KernelStack: 12424 kB' 'PageTables: 5236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79528 kB' 'Slab: 312352 kB' 'SReclaimable: 79528 kB' 'SUnreclaim: 232824 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.677 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.678 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698416 kB' 'MemFree: 13418912 kB' 'MemUsed: 14279504 kB' 'SwapCached: 0 kB' 'Active: 6078328 kB' 'Inactive: 3486216 kB' 'Active(anon): 5854236 kB' 'Inactive(anon): 0 kB' 'Active(file): 224092 kB' 'Inactive(file): 3486216 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9472840 kB' 'Mapped: 86064 kB' 'AnonPages: 91732 kB' 'Shmem: 5762532 kB' 'KernelStack: 9240 kB' 
'PageTables: 2260 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146088 kB' 'Slab: 525640 kB' 'SReclaimable: 146088 kB' 'SUnreclaim: 379552 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.679 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:05:29.680 node0=512 expecting 512 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024' 00:05:29.680 node1=1024 expecting 1024 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:29.680 00:05:29.680 real 0m3.751s 00:05:29.680 user 0m1.380s 00:05:29.680 sys 0m2.437s 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:29.680 04:22:08 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:29.680 ************************************ 00:05:29.680 END TEST custom_alloc 00:05:29.680 ************************************ 00:05:29.680 04:22:08 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:29.680 04:22:08 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:29.680 04:22:08 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:29.680 04:22:08 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:29.680 ************************************ 
00:05:29.680 START TEST no_shrink_alloc 00:05:29.680 ************************************ 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:29.680 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:29.940 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0 00:05:29.940 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024 00:05:29.940 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0 00:05:29.940 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output 00:05:29.940 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:29.940 04:22:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:33.239 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.5 (8086 
2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:33.239 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41643468 kB' 'MemAvailable: 45326540 kB' 'Buffers: 8940 kB' 'Cached: 12477624 kB' 'SwapCached: 0 kB' 'Active: 9513264 kB' 'Inactive: 3663056 kB' 'Active(anon): 9109208 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 693112 kB' 'Mapped: 141332 kB' 'Shmem: 8419452 kB' 'KReclaimable: 225656 kB' 'Slab: 838784 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613128 kB' 'KernelStack: 21712 kB' 'PageTables: 7564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10843732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.239 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 
04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.240 
04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.240 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 
04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.506 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 
04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Surp 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41643432 kB' 'MemAvailable: 45326504 kB' 'Buffers: 8940 kB' 'Cached: 12477656 kB' 'SwapCached: 0 kB' 'Active: 9513248 kB' 'Inactive: 3663056 kB' 'Active(anon): 9109192 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 693032 kB' 'Mapped: 141300 kB' 'Shmem: 8419484 kB' 'KReclaimable: 225656 kB' 'Slab: 839100 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613444 kB' 'KernelStack: 21712 kB' 'PageTables: 7544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10843752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.507 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.508 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 
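[editor's note] The trace above is setup/common.sh's get_meminfo helper scanning the /proc/meminfo snapshot field by field: each "key: value" pair is split with IFS=': ' into var/val, and the loop issues "continue" until the requested key (AnonHugePages, then HugePages_Surp) matches, at which point the value is echoed (0 in both cases here). A minimal standalone sketch of that lookup, under the assumption of the same /proc/meminfo layout; get_meminfo_sketch is an illustrative name, and the real helper additionally snapshots the file with mapfile and strips "Node N" prefixes for per-node files:

# get_meminfo_sketch KEY [NODE] - print KEY's value from /proc/meminfo,
# or from the per-node meminfo file when NODE is given (illustrative only).
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] && \
        mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    while IFS=': ' read -r var val _; do
        # skip every line until the requested key is reached
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}
# e.g. get_meminfo_sketch HugePages_Surp  -> prints 0 on this test node

The trace resumes below with the same scan repeated for HugePages_Rsvd.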
00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41642524 kB' 'MemAvailable: 45325596 kB' 'Buffers: 8940 kB' 'Cached: 12477660 kB' 'SwapCached: 0 kB' 'Active: 9514076 kB' 'Inactive: 3663056 kB' 'Active(anon): 9110020 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 693952 kB' 'Mapped: 141300 kB' 'Shmem: 8419488 kB' 'KReclaimable: 225656 kB' 'Slab: 839104 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613448 kB' 'KernelStack: 21712 kB' 'PageTables: 7556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10843404 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.509 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.510 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:33.511 nr_hugepages=1024 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:33.511 resv_hugepages=0 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:33.511 surplus_hugepages=0 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:33.511 anon_hugepages=0 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.511 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41642452 kB' 'MemAvailable: 45325524 kB' 'Buffers: 8940 kB' 'Cached: 12477660 kB' 'SwapCached: 0 kB' 'Active: 9514036 kB' 'Inactive: 3663056 kB' 'Active(anon): 9109980 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 693912 kB' 'Mapped: 141300 kB' 'Shmem: 8419488 kB' 'KReclaimable: 225656 kB' 'Slab: 839064 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613408 kB' 'KernelStack: 21696 kB' 'PageTables: 7476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10843432 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
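[editor's note] By this point hugepages.sh has collected anon=0, surp=0 and resv=0 and echoed nr_hugepages=1024, and the checks at hugepages.sh@106/@108 in the trace verify that the kernel's reported HugePages_Total matches the requested count once surplus and reserved pages are accounted for. A short sketch of that accounting, assuming the values shown in the trace and reusing the illustrative helper from the earlier note:

nr_hugepages=1024                      # requested number of 2048 kB pages
anon=0; surp=0; resv=0                 # AnonHugePages / HugePages_Surp / HugePages_Rsvd above
total=$(get_meminfo_sketch HugePages_Total)   # 1024 in the meminfo dumps above

# Expect the kernel's total to equal the request plus surplus and reserved
# pages (both zero here), i.e. 1024 == 1024 + 0 + 0; with a 2048 kB page
# size this matches the Hugetlb figure of 2097152 kB in the dump.
if (( total == nr_hugepages + surp + resv )) && (( total == nr_hugepages )); then
    echo "hugepage accounting consistent: ${total} x 2048 kB = $(( total * 2048 )) kB"
fi

The trace below is the same get_meminfo scan, this time for HugePages_Total.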
00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.511 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.512 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.513 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26113056 kB' 'MemUsed: 6472312 kB' 'SwapCached: 0 kB' 'Active: 3434656 kB' 'Inactive: 176840 kB' 'Active(anon): 3254692 kB' 'Inactive(anon): 0 kB' 'Active(file): 179964 kB' 'Inactive(file): 176840 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3013724 kB' 'Mapped: 55220 kB' 'AnonPages: 600984 kB' 'Shmem: 2656920 kB' 'KernelStack: 12408 kB' 'PageTables: 5152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79568 kB' 'Slab: 312740 kB' 'SReclaimable: 79568 kB' 'SUnreclaim: 233172 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
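For reference, the scan running through this part of the trace is the get_meminfo helper from setup/common.sh walking /sys/devices/system/node/node0/meminfo key by key until it reaches the field it was asked for. A minimal standalone sketch of the same pattern in bash (the name get_meminfo_sketch is hypothetical; this is an illustration, not the SPDK helper itself):

#!/usr/bin/env bash
# Sketch: print one field from /proc/meminfo, or from a per-node meminfo file
# when a node number is given, the way the trace above shows it being done
# (strip the "Node <n> " prefix, split on ': ', return the value).
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    local -a mem
    local line var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")           # per-node lines carry a "Node <n>" prefix
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"  # e.g. var=HugePages_Total, val=1024
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

get_meminfo_sketch HugePages_Total 0    # prints 1024 for node0 on this runner, per the trace above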
00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.513 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
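The values this scan feeds go into the per-node bookkeeping that get_nodes set up earlier in the trace (the loop over /sys/devices/system/node/node+([0-9]) that found two nodes, with all 1024 pages on node0). A condensed sketch of that bookkeeping, reusing the hypothetical get_meminfo_sketch helper above and leaving out the surplus/reserved adjustments hugepages.sh applies:

# Sketch: record how the hugepage pool is spread across NUMA nodes.
shopt -s nullglob
declare -A node_pages
for node_dir in /sys/devices/system/node/node[0-9]*; do
    n=${node_dir##*node}
    node_pages[$n]=$(get_meminfo_sketch HugePages_Total "$n")
done
for n in "${!node_pages[@]}"; do
    echo "node$n=${node_pages[$n]} hugepages"   # the trace's own check prints "node0=1024 expecting 1024"
done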
00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:33.514 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:33.515 node0=1024 expecting 1024 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:33.515 04:22:12 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:36.814 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:36.814 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:36.814 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41642492 kB' 'MemAvailable: 45325564 kB' 'Buffers: 8940 kB' 'Cached: 12477884 kB' 'SwapCached: 0 kB' 'Active: 9519356 kB' 'Inactive: 3663056 kB' 'Active(anon): 9115300 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 698464 kB' 'Mapped: 141380 kB' 'Shmem: 8419712 kB' 'KReclaimable: 225656 kB' 'Slab: 839028 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613372 kB' 'KernelStack: 21792 kB' 'PageTables: 7540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10847676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.081 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
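The pool being re-measured here was produced a few lines back, when hugepages.sh re-ran the setup script with its allocation knobs (CLEAR_HUGE=no, NRHUGE=512, HUGENODE=0). A hedged reconstruction of that call, using the workspace path shown in this log (setup.sh needs root):

# Ask for 512 x 2 MB hugepages on NUMA node 0 without clearing what is already reserved.
CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 \
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh

Because CLEAR_HUGE=no, the 1024 pages already reserved on node0 are left in place rather than shrunk to 512, which is what the "INFO: Requested 512 hugepages but 1024 already allocated on node0" line reports and what this no_shrink_alloc test expects the verification below to confirm.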
00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.082 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41644432 kB' 'MemAvailable: 45327504 kB' 'Buffers: 8940 kB' 'Cached: 12477888 kB' 'SwapCached: 0 kB' 'Active: 9518712 kB' 'Inactive: 3663056 kB' 'Active(anon): 9114656 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 698208 kB' 'Mapped: 141300 kB' 'Shmem: 8419716 kB' 'KReclaimable: 225656 kB' 'Slab: 839020 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613364 kB' 'KernelStack: 21712 kB' 'PageTables: 7588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10847692 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 
04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.083 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.084 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.084 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41644376 kB' 'MemAvailable: 45327448 kB' 'Buffers: 8940 kB' 'Cached: 12477908 kB' 'SwapCached: 0 kB' 'Active: 9518828 kB' 'Inactive: 3663056 kB' 'Active(anon): 9114772 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 698360 kB' 'Mapped: 141300 kB' 'Shmem: 8419736 kB' 'KReclaimable: 225656 kB' 'Slab: 839020 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613364 kB' 'KernelStack: 21776 kB' 'PageTables: 7624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10847716 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 
04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.085 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.086 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:37.087 nr_hugepages=1024 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:37.087 resv_hugepages=0 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:37.087 surplus_hugepages=0 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:37.087 anon_hugepages=0 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283784 kB' 'MemFree: 41643824 kB' 'MemAvailable: 45326896 kB' 'Buffers: 8940 kB' 'Cached: 12477908 kB' 'SwapCached: 0 kB' 'Active: 9518828 kB' 'Inactive: 3663056 kB' 'Active(anon): 9114772 kB' 'Inactive(anon): 0 kB' 'Active(file): 404056 kB' 'Inactive(file): 3663056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 698288 kB' 'Mapped: 141304 kB' 'Shmem: 8419736 kB' 'KReclaimable: 225656 kB' 'Slab: 839084 kB' 'SReclaimable: 225656 kB' 'SUnreclaim: 613428 kB' 'KernelStack: 21744 kB' 'PageTables: 7692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481920 kB' 'Committed_AS: 10847736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 73920 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 558452 kB' 
'DirectMap2M: 9613312 kB' 'DirectMap1G: 59768832 kB' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.087 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.088 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26107580 kB' 'MemUsed: 6477788 kB' 'SwapCached: 0 kB' 'Active: 3438292 kB' 'Inactive: 176840 kB' 'Active(anon): 3258328 kB' 'Inactive(anon): 0 kB' 'Active(file): 179964 kB' 'Inactive(file): 176840 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3013840 kB' 'Mapped: 55220 kB' 'AnonPages: 604392 kB' 'Shmem: 2657036 kB' 'KernelStack: 12472 kB' 'PageTables: 5232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79568 kB' 'Slab: 312800 kB' 'SReclaimable: 79568 kB' 'SUnreclaim: 233232 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.089 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
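The same scan is then repeated with node=0, and the trace shows get_meminfo switching its input from /proc/meminfo to the node-local sysfs file and stripping the "Node 0 " prefix from each line. A sketch of that source selection, using sed in place of the script's extglob expansion purely for illustration:

node=0
mem_f=/proc/meminfo
if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo   # per-node counters
fi
# node meminfo lines read "Node 0 MemTotal: 32585368 kB"; drop the prefix so the
# field names match the global /proc/meminfo layout before scanning them.
mapfile -t mem < <(sed "s/^Node $node //" "$mem_f")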
00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:37.090 node0=1024 expecting 1024 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:37.090 00:05:37.090 real 0m7.377s 00:05:37.090 user 0m2.752s 00:05:37.090 sys 0m4.755s 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:37.090 04:22:15 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:37.090 ************************************ 00:05:37.090 END TEST no_shrink_alloc 00:05:37.090 ************************************ 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:37.351 04:22:15 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:37.351 00:05:37.351 real 0m24.675s 00:05:37.351 user 0m8.641s 00:05:37.351 sys 0m14.915s 
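Before handing off to the driver suite, clear_hp (traced just above) zeroes every per-node hugepage pool and exports CLEAR_HUGE=yes. Functionally that boils down to the following sketch, assuming the standard sysfs layout in which each hugepages-<size> directory exposes an nr_hugepages counter:

for node_dir in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node_dir"/hugepages/hugepages-*; do
        echo 0 | sudo tee "$hp/nr_hugepages" > /dev/null   # release the pages reserved by the test
    done
done
export CLEAR_HUGE=yes   # tells later setup.sh runs to start from a clean hugepage state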
00:05:37.351 04:22:15 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:37.351 04:22:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:37.351 ************************************ 00:05:37.351 END TEST hugepages 00:05:37.351 ************************************ 00:05:37.351 04:22:15 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:37.351 04:22:15 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:37.351 04:22:15 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:37.351 04:22:15 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:37.351 ************************************ 00:05:37.351 START TEST driver 00:05:37.351 ************************************ 00:05:37.351 04:22:16 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:37.351 * Looking for test storage... 00:05:37.351 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:37.351 04:22:16 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:37.351 04:22:16 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:05:37.351 04:22:16 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:37.612 04:22:16 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:37.612 04:22:16 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:37.612 04:22:16 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:37.612 04:22:16 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:37.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.612 --rc genhtml_branch_coverage=1 00:05:37.612 --rc genhtml_function_coverage=1 00:05:37.612 --rc genhtml_legend=1 00:05:37.612 --rc geninfo_all_blocks=1 00:05:37.612 --rc geninfo_unexecuted_blocks=1 00:05:37.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.612 ' 00:05:37.612 04:22:16 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:37.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.612 --rc genhtml_branch_coverage=1 00:05:37.612 --rc genhtml_function_coverage=1 00:05:37.612 --rc genhtml_legend=1 00:05:37.612 --rc geninfo_all_blocks=1 00:05:37.612 --rc geninfo_unexecuted_blocks=1 00:05:37.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.612 ' 00:05:37.612 04:22:16 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:37.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.612 --rc genhtml_branch_coverage=1 00:05:37.612 --rc genhtml_function_coverage=1 00:05:37.612 --rc genhtml_legend=1 00:05:37.612 --rc geninfo_all_blocks=1 00:05:37.612 --rc geninfo_unexecuted_blocks=1 00:05:37.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.612 ' 00:05:37.612 04:22:16 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:37.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.612 --rc genhtml_branch_coverage=1 00:05:37.612 --rc genhtml_function_coverage=1 00:05:37.612 --rc genhtml_legend=1 00:05:37.612 --rc geninfo_all_blocks=1 00:05:37.612 --rc geninfo_unexecuted_blocks=1 00:05:37.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.612 ' 00:05:37.612 04:22:16 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:37.612 04:22:16 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:37.612 04:22:16 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:42.900 04:22:21 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:42.900 04:22:21 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:42.900 04:22:21 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:42.900 04:22:21 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:42.900 ************************************ 00:05:42.900 START TEST guess_driver 00:05:42.900 ************************************ 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:42.900 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:42.900 Looking for driver=vfio-pci 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:42.900 04:22:21 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:46.202 04:22:24 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:48.140 04:22:26 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:53.429 00:05:53.429 real 0m10.292s 00:05:53.429 user 0m2.728s 00:05:53.429 sys 0m5.228s 00:05:53.429 04:22:31 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:53.429 04:22:31 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:53.429 ************************************ 00:05:53.429 END TEST guess_driver 00:05:53.429 ************************************ 00:05:53.429 00:05:53.429 real 0m15.560s 00:05:53.429 user 0m4.258s 00:05:53.429 sys 0m8.143s 00:05:53.429 04:22:31 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:53.429 04:22:31 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:53.429 ************************************ 00:05:53.429 END TEST driver 00:05:53.429 ************************************ 00:05:53.429 04:22:31 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:53.429 04:22:31 setup.sh -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:53.429 04:22:31 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:53.429 04:22:31 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:53.429 ************************************ 00:05:53.429 START TEST devices 00:05:53.429 ************************************ 00:05:53.429 04:22:31 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:53.429 * Looking for test storage... 00:05:53.429 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:53.429 04:22:31 setup.sh.devices -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:53.429 04:22:31 setup.sh.devices -- common/autotest_common.sh@1693 -- # lcov --version 00:05:53.429 04:22:31 setup.sh.devices -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:53.429 04:22:31 setup.sh.devices -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:53.429 04:22:31 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:53.430 04:22:31 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:53.430 04:22:31 setup.sh.devices -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:53.430 04:22:31 setup.sh.devices -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.430 --rc genhtml_branch_coverage=1 00:05:53.430 --rc genhtml_function_coverage=1 00:05:53.430 --rc genhtml_legend=1 00:05:53.430 --rc geninfo_all_blocks=1 00:05:53.430 --rc geninfo_unexecuted_blocks=1 00:05:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.430 ' 00:05:53.430 04:22:31 setup.sh.devices -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.430 --rc genhtml_branch_coverage=1 00:05:53.430 --rc genhtml_function_coverage=1 00:05:53.430 --rc genhtml_legend=1 00:05:53.430 --rc geninfo_all_blocks=1 00:05:53.430 --rc geninfo_unexecuted_blocks=1 00:05:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.430 ' 00:05:53.430 04:22:31 setup.sh.devices -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.430 --rc genhtml_branch_coverage=1 00:05:53.430 --rc genhtml_function_coverage=1 00:05:53.430 --rc genhtml_legend=1 00:05:53.430 --rc geninfo_all_blocks=1 00:05:53.430 --rc geninfo_unexecuted_blocks=1 00:05:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.430 ' 00:05:53.430 04:22:31 setup.sh.devices -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:53.430 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:53.430 --rc genhtml_branch_coverage=1 00:05:53.430 --rc genhtml_function_coverage=1 00:05:53.430 --rc genhtml_legend=1 00:05:53.430 --rc geninfo_all_blocks=1 00:05:53.430 --rc geninfo_unexecuted_blocks=1 00:05:53.430 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:53.430 ' 00:05:53.430 04:22:31 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:53.430 04:22:31 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:53.430 04:22:31 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:53.430 04:22:31 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:57.633 04:22:35 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:57.633 No valid GPT data, bailing 00:05:57.633 04:22:35 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:57.633 04:22:35 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:57.633 04:22:35 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:57.633 04:22:35 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:57.633 04:22:35 
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:57.633 04:22:35 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:57.633 ************************************ 00:05:57.633 START TEST nvme_mount 00:05:57.633 ************************************ 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:57.633 04:22:35 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:58.204 Creating new GPT entries in memory. 00:05:58.204 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:58.204 other utilities. 00:05:58.204 04:22:36 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:58.204 04:22:36 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:58.204 04:22:36 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:58.204 04:22:36 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:58.204 04:22:36 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:59.147 Creating new GPT entries in memory. 00:05:59.147 The operation has completed successfully. 
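At this point sgdisk has wiped the test disk and created partition 1 spanning sectors 2048-2099199, i.e. 2097152 sectors x 512 B = 1 GiB, which the test then formats and mounts. Condensed into a standalone sketch (the device path is this runner's test NVMe; the mount point below is a hypothetical stand-in for the spdk test/setup/nvme_mount directory, and the sync_dev_uevents.sh wrapper is omitted):

disk=/dev/nvme0n1
mnt=/tmp/spdk_nvme_mount
sudo sgdisk "$disk" --zap-all               # destroy existing GPT/MBR structures
sudo sgdisk "$disk" --new=1:2048:2099199    # single 1 GiB test partition
sudo mkfs.ext4 -qF "${disk}p1"              # quiet, forced ext4 format
mkdir -p "$mnt" && sudo mount "${disk}p1" "$mnt"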
00:05:59.147 04:22:37 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:59.147 04:22:37 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:59.147 04:22:37 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 117899 00:05:59.407 04:22:37 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.407 04:22:37 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:59.407 04:22:37 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:59.407 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:59.408 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:59.408 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.408 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:59.408 04:22:38 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:59.408 04:22:38 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:59.408 04:22:38 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.706 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:02.707 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:02.707 04:22:41 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:02.967 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:02.967 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:03.228 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:03.228 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:03.228 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:03.228 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:03.228 04:22:41 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:06.527 04:22:45 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.827 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:10.088 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:10.088 00:06:10.088 real 0m12.988s 00:06:10.088 user 0m3.733s 00:06:10.088 sys 0m7.195s 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:10.088 04:22:48 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:10.088 ************************************ 00:06:10.088 END TEST nvme_mount 00:06:10.088 ************************************ 00:06:10.088 04:22:48 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:10.088 04:22:48 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:10.088 04:22:48 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:10.088 04:22:48 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:10.349 ************************************ 00:06:10.349 START TEST dm_mount 00:06:10.349 ************************************ 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:10.349 04:22:48 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:11.290 Creating new GPT entries in memory. 00:06:11.290 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:11.290 other utilities. 00:06:11.290 04:22:49 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:11.290 04:22:49 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:11.290 04:22:49 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:11.290 04:22:49 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:11.290 04:22:49 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:12.230 Creating new GPT entries in memory. 00:06:12.230 The operation has completed successfully. 00:06:12.230 04:22:50 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:12.230 04:22:50 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:12.230 04:22:50 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:12.230 04:22:50 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:12.230 04:22:50 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:13.614 The operation has completed successfully. 
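Here the dm_mount variant creates two 1 GiB partitions: the second one starts one sector past the first (2099200) and ends at 2099200 + 2097152 - 1 = 4196351. Further down in the trace, a device-mapper node named nvme_dm_test is created over the two partitions, formatted with mkfs.ext4 -qF, and mounted. The log does not print the dm table the script feeds to dmsetup, so the following is only an illustrative sketch that assumes a plain linear concatenation of the two partitions:

    # Hypothetical table; the actual mapping used by setup/devices.sh is not shown in this log.
    printf '%s\n' \
        '0 2097152 linear /dev/nvme0n1p1 0' \
        '2097152 2097152 linear /dev/nvme0n1p2 0' | dmsetup create nvme_dm_test
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount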
00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 122457 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:13.614 04:22:52 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:16.914 04:22:55 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:16.914 04:22:55 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:20.213 04:22:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.485 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:20.486 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:20.486 00:06:20.486 real 0m10.269s 00:06:20.486 user 0m2.521s 00:06:20.486 sys 0m4.811s 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:20.486 04:22:59 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:20.486 ************************************ 00:06:20.486 END TEST dm_mount 00:06:20.486 ************************************ 00:06:20.486 04:22:59 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:20.486 04:22:59 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:20.487 04:22:59 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.487 04:22:59 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:20.487 04:22:59 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:20.487 04:22:59 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:20.487 04:22:59 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:20.750 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:20.750 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:20.750 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:20.750 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:20.750 04:22:59 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:20.750 04:22:59 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:20.750 04:22:59 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:20.750 04:22:59 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:20.751 04:22:59 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:20.751 04:22:59 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:20.751 04:22:59 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:20.751 00:06:20.751 real 0m27.893s 00:06:20.751 user 0m7.878s 00:06:20.751 sys 0m14.949s 00:06:20.751 04:22:59 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:20.751 04:22:59 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:20.751 ************************************ 00:06:20.751 END TEST devices 00:06:20.751 ************************************ 00:06:21.011 00:06:21.011 real 1m34.081s 00:06:21.011 user 0m29.133s 00:06:21.011 sys 0m53.985s 00:06:21.011 04:22:59 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:21.011 04:22:59 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:21.011 ************************************ 00:06:21.011 END TEST setup.sh 00:06:21.011 ************************************ 00:06:21.011 04:22:59 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:24.310 Hugepages 00:06:24.310 node hugesize free / total 00:06:24.310 node0 1048576kB 0 / 0 00:06:24.310 node0 2048kB 1024 / 1024 00:06:24.310 node1 1048576kB 0 / 0 00:06:24.310 node1 2048kB 1024 / 1024 00:06:24.310 00:06:24.310 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:24.310 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:24.310 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:24.571 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:24.571 04:23:03 -- spdk/autotest.sh@117 -- # uname -s 00:06:24.571 04:23:03 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:24.571 04:23:03 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:24.571 04:23:03 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:27.869 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:27.869 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:27.869 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:28.130 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:28.130 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:28.130 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:28.130 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:28.130 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:28.130 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:29.576 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:29.836 04:23:08 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:30.779 04:23:09 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:30.779 04:23:09 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:30.779 04:23:09 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:30.779 04:23:09 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:30.779 04:23:09 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:30.779 04:23:09 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:30.779 04:23:09 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:30.779 04:23:09 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:30.779 04:23:09 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:31.039 04:23:09 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:31.039 04:23:09 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:31.039 04:23:09 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:34.336 Waiting for block devices as requested 00:06:34.336 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:34.336 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:34.336 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:34.597 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:34.597 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:34.597 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:34.857 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:34.857 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:34.857 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:35.118 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:35.118 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:35.118 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:35.378 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:35.378 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:35.378 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:35.638 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:35.638 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:35.899 04:23:14 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:06:35.899 04:23:14 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:06:35.899 04:23:14 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:35.899 04:23:14 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:06:35.899 04:23:14 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1531 -- # grep oacs 00:06:35.899 04:23:14 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:06:35.899 04:23:14 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:06:35.899 04:23:14 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:06:35.899 04:23:14 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:06:35.899 04:23:14 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:06:35.899 04:23:14 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:06:35.899 04:23:14 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:06:35.899 04:23:14 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:06:35.899 04:23:14 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:06:35.899 04:23:14 -- common/autotest_common.sh@1543 -- # continue 00:06:35.899 04:23:14 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:35.899 04:23:14 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:35.899 04:23:14 -- common/autotest_common.sh@10 -- # set +x 00:06:35.899 04:23:14 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:35.899 04:23:14 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:35.899 04:23:14 -- common/autotest_common.sh@10 -- # set +x 00:06:35.899 04:23:14 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:39.201 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:39.201 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:39.461 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:39.461 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:39.462 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:41.374 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:41.374 04:23:19 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:41.374 04:23:19 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:41.374 04:23:19 -- common/autotest_common.sh@10 -- # set +x 00:06:41.374 04:23:19 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:41.374 04:23:19 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:06:41.374 04:23:19 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:06:41.374 04:23:19 -- common/autotest_common.sh@1563 -- # bdfs=() 00:06:41.374 04:23:19 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:06:41.374 04:23:19 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:06:41.374 04:23:19 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:06:41.375 04:23:19 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:06:41.375 04:23:19 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:41.375 04:23:19 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:41.375 04:23:19 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:41.375 04:23:19 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:41.375 04:23:19 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:41.375 04:23:20 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:41.375 04:23:20 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:41.375 04:23:20 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:06:41.375 04:23:20 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:41.375 04:23:20 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:06:41.375 04:23:20 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:41.375 04:23:20 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:06:41.375 04:23:20 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:06:41.375 04:23:20 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:d8:00.0 00:06:41.375 04:23:20 -- common/autotest_common.sh@1579 -- # [[ -z 0000:d8:00.0 ]] 00:06:41.375 04:23:20 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=132372 00:06:41.375 04:23:20 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:41.375 04:23:20 -- common/autotest_common.sh@1585 -- # waitforlisten 132372 00:06:41.375 04:23:20 -- common/autotest_common.sh@835 -- # '[' -z 132372 ']' 00:06:41.375 04:23:20 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:41.375 04:23:20 -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:41.375 04:23:20 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:41.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:41.375 04:23:20 -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:41.375 04:23:20 -- common/autotest_common.sh@10 -- # set +x 00:06:41.375 [2024-11-17 04:23:20.091313] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
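Before attempting the opal revert, the harness enumerates NVMe controllers by asking SPDK's gen_nvme.sh for their PCI addresses and keeps only controllers whose PCI device ID is 0x0a54, the ID checked in the trace above. A condensed sketch of that discovery loop, assuming rootdir points at the SPDK checkout used in this run:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
    for bdf in "${bdfs[@]}"; do
        # keep only controllers whose PCI device ID matches the one the test targets
        [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && echo "$bdf"
    done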
00:06:41.375 [2024-11-17 04:23:20.091370] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid132372 ] 00:06:41.375 [2024-11-17 04:23:20.178241] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.375 [2024-11-17 04:23:20.201520] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.635 04:23:20 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:41.635 04:23:20 -- common/autotest_common.sh@868 -- # return 0 00:06:41.635 04:23:20 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:06:41.635 04:23:20 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:06:41.635 04:23:20 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:44.934 nvme0n1 00:06:44.934 04:23:23 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:44.934 [2024-11-17 04:23:23.603026] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:44.934 request: 00:06:44.934 { 00:06:44.934 "nvme_ctrlr_name": "nvme0", 00:06:44.934 "password": "test", 00:06:44.934 "method": "bdev_nvme_opal_revert", 00:06:44.934 "req_id": 1 00:06:44.934 } 00:06:44.934 Got JSON-RPC error response 00:06:44.934 response: 00:06:44.934 { 00:06:44.934 "code": -32602, 00:06:44.934 "message": "Invalid parameters" 00:06:44.934 } 00:06:44.934 04:23:23 -- common/autotest_common.sh@1591 -- # true 00:06:44.934 04:23:23 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:06:44.934 04:23:23 -- common/autotest_common.sh@1595 -- # killprocess 132372 00:06:44.934 04:23:23 -- common/autotest_common.sh@954 -- # '[' -z 132372 ']' 00:06:44.934 04:23:23 -- common/autotest_common.sh@958 -- # kill -0 132372 00:06:44.934 04:23:23 -- common/autotest_common.sh@959 -- # uname 00:06:44.934 04:23:23 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:44.934 04:23:23 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 132372 00:06:44.934 04:23:23 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:44.934 04:23:23 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:44.934 04:23:23 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 132372' 00:06:44.934 killing process with pid 132372 00:06:44.935 04:23:23 -- common/autotest_common.sh@973 -- # kill 132372 00:06:44.935 04:23:23 -- common/autotest_common.sh@978 -- # wait 132372 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:44.935 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 
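The revert itself is driven over SPDK's JSON-RPC interface: the controller at 0000:d8:00.0 is attached as nvme0, exposing nvme0n1, and bdev_nvme_opal_revert is then issued with the password test. On this drive the call fails with error -32602 ("nvme0 not support opal"), and the harness deliberately tolerates the failure before shutting spdk_tgt down, which is when the repeated EAL teardown warnings below are emitted. The same two calls, issued by hand against a running spdk_tgt (paths and arguments taken from this run):

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    $rpc bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0   # prints the new bdev name, nvme0n1
    $rpc bdev_nvme_opal_revert -b nvme0 -p test || true                 # non-fatal on controllers without Opal support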
00:06:45.197 EAL: Unexpected size 0 of DMA remapping
cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:45.197 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:47.108 04:23:25 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:47.108 04:23:25 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:47.108 04:23:25 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:47.108 04:23:25 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:47.108 04:23:25 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:47.108 04:23:25 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:47.108 04:23:25 -- common/autotest_common.sh@10 -- # set +x 00:06:47.108 04:23:25 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:47.108 04:23:25 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:47.108 04:23:25 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.108 04:23:25 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.108 04:23:25 -- common/autotest_common.sh@10 -- # set +x 00:06:47.108 ************************************ 00:06:47.108 START TEST env 00:06:47.108 ************************************ 00:06:47.108 04:23:25 env -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:47.108 * Looking for test storage... 00:06:47.368 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:47.369 04:23:25 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:47.369 04:23:25 env -- common/autotest_common.sh@1693 -- # lcov --version 00:06:47.369 04:23:25 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:47.369 04:23:26 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:47.369 04:23:26 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:47.369 04:23:26 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:47.369 04:23:26 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:47.369 04:23:26 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:47.369 04:23:26 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:47.369 04:23:26 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:47.369 04:23:26 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:47.369 04:23:26 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:47.369 04:23:26 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:47.369 04:23:26 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:47.369 04:23:26 env -- scripts/common.sh@344 -- # case "$op" in 00:06:47.369 04:23:26 env -- scripts/common.sh@345 -- # : 1 00:06:47.369 04:23:26 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:47.369 04:23:26 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:47.369 04:23:26 env -- scripts/common.sh@365 -- # decimal 1 00:06:47.369 04:23:26 env -- scripts/common.sh@353 -- # local d=1 00:06:47.369 04:23:26 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:47.369 04:23:26 env -- scripts/common.sh@355 -- # echo 1 00:06:47.369 04:23:26 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:47.369 04:23:26 env -- scripts/common.sh@366 -- # decimal 2 00:06:47.369 04:23:26 env -- scripts/common.sh@353 -- # local d=2 00:06:47.369 04:23:26 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:47.369 04:23:26 env -- scripts/common.sh@355 -- # echo 2 00:06:47.369 04:23:26 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:47.369 04:23:26 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:47.369 04:23:26 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:47.369 04:23:26 env -- scripts/common.sh@368 -- # return 0 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:47.369 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.369 --rc genhtml_branch_coverage=1 00:06:47.369 --rc genhtml_function_coverage=1 00:06:47.369 --rc genhtml_legend=1 00:06:47.369 --rc geninfo_all_blocks=1 00:06:47.369 --rc geninfo_unexecuted_blocks=1 00:06:47.369 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.369 ' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:47.369 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.369 --rc genhtml_branch_coverage=1 00:06:47.369 --rc genhtml_function_coverage=1 00:06:47.369 --rc genhtml_legend=1 00:06:47.369 --rc geninfo_all_blocks=1 00:06:47.369 --rc geninfo_unexecuted_blocks=1 00:06:47.369 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.369 ' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:47.369 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.369 --rc genhtml_branch_coverage=1 00:06:47.369 --rc genhtml_function_coverage=1 00:06:47.369 --rc genhtml_legend=1 00:06:47.369 --rc geninfo_all_blocks=1 00:06:47.369 --rc geninfo_unexecuted_blocks=1 00:06:47.369 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.369 ' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:47.369 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.369 --rc genhtml_branch_coverage=1 00:06:47.369 --rc genhtml_function_coverage=1 00:06:47.369 --rc genhtml_legend=1 00:06:47.369 --rc geninfo_all_blocks=1 00:06:47.369 --rc geninfo_unexecuted_blocks=1 00:06:47.369 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.369 ' 00:06:47.369 04:23:26 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.369 04:23:26 env -- common/autotest_common.sh@10 -- # set +x 00:06:47.369 ************************************ 00:06:47.369 START TEST env_memory 00:06:47.369 ************************************ 00:06:47.369 04:23:26 env.env_memory -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:47.369 00:06:47.369 00:06:47.369 CUnit - A unit testing framework for C - Version 2.1-3 00:06:47.369 http://cunit.sourceforge.net/ 00:06:47.369 00:06:47.369 00:06:47.369 Suite: memory 00:06:47.369 Test: alloc and free memory map ...[2024-11-17 04:23:26.088606] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:47.369 passed 00:06:47.369 Test: mem map translation ...[2024-11-17 04:23:26.101467] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:47.369 [2024-11-17 04:23:26.101484] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:47.369 [2024-11-17 04:23:26.101516] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:47.369 [2024-11-17 04:23:26.101529] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:47.369 passed 00:06:47.369 Test: mem map registration ...[2024-11-17 04:23:26.122905] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:47.369 [2024-11-17 04:23:26.122920] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:47.369 passed 00:06:47.369 Test: mem map adjacent registrations ...passed 00:06:47.369 00:06:47.369 Run Summary: Type Total Ran Passed Failed Inactive 00:06:47.369 suites 1 1 n/a 0 0 00:06:47.369 tests 4 4 4 0 0 00:06:47.369 asserts 152 152 152 0 n/a 00:06:47.369 00:06:47.369 Elapsed time = 0.076 seconds 00:06:47.369 00:06:47.369 real 0m0.085s 00:06:47.369 user 0m0.073s 00:06:47.369 sys 0m0.012s 00:06:47.369 04:23:26 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.369 04:23:26 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:47.369 ************************************ 00:06:47.369 END TEST env_memory 00:06:47.369 ************************************ 00:06:47.369 04:23:26 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.369 04:23:26 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.369 04:23:26 env -- common/autotest_common.sh@10 -- # set +x 00:06:47.629 ************************************ 00:06:47.630 START TEST env_vtophys 00:06:47.630 ************************************ 00:06:47.630 04:23:26 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:47.630 EAL: lib.eal log level changed from notice to debug 00:06:47.630 EAL: Detected lcore 0 as core 0 on socket 0 00:06:47.630 EAL: Detected lcore 1 as core 1 on socket 0 00:06:47.630 EAL: Detected lcore 2 as core 2 on socket 0 00:06:47.630 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:47.630 EAL: Detected lcore 4 as core 4 on socket 0 00:06:47.630 EAL: Detected lcore 5 as core 5 on socket 0 00:06:47.630 EAL: Detected lcore 6 as core 6 on socket 0 00:06:47.630 EAL: Detected lcore 7 as core 8 on socket 0 00:06:47.630 EAL: Detected lcore 8 as core 9 on socket 0 00:06:47.630 EAL: Detected lcore 9 as core 10 on socket 0 00:06:47.630 EAL: Detected lcore 10 as core 11 on socket 0 00:06:47.630 EAL: Detected lcore 11 as core 12 on socket 0 00:06:47.630 EAL: Detected lcore 12 as core 13 on socket 0 00:06:47.630 EAL: Detected lcore 13 as core 14 on socket 0 00:06:47.630 EAL: Detected lcore 14 as core 16 on socket 0 00:06:47.630 EAL: Detected lcore 15 as core 17 on socket 0 00:06:47.630 EAL: Detected lcore 16 as core 18 on socket 0 00:06:47.630 EAL: Detected lcore 17 as core 19 on socket 0 00:06:47.630 EAL: Detected lcore 18 as core 20 on socket 0 00:06:47.630 EAL: Detected lcore 19 as core 21 on socket 0 00:06:47.630 EAL: Detected lcore 20 as core 22 on socket 0 00:06:47.630 EAL: Detected lcore 21 as core 24 on socket 0 00:06:47.630 EAL: Detected lcore 22 as core 25 on socket 0 00:06:47.630 EAL: Detected lcore 23 as core 26 on socket 0 00:06:47.630 EAL: Detected lcore 24 as core 27 on socket 0 00:06:47.630 EAL: Detected lcore 25 as core 28 on socket 0 00:06:47.630 EAL: Detected lcore 26 as core 29 on socket 0 00:06:47.630 EAL: Detected lcore 27 as core 30 on socket 0 00:06:47.630 EAL: Detected lcore 28 as core 0 on socket 1 00:06:47.630 EAL: Detected lcore 29 as core 1 on socket 1 00:06:47.630 EAL: Detected lcore 30 as core 2 on socket 1 00:06:47.630 EAL: Detected lcore 31 as core 3 on socket 1 00:06:47.630 EAL: Detected lcore 32 as core 4 on socket 1 00:06:47.630 EAL: Detected lcore 33 as core 5 on socket 1 00:06:47.630 EAL: Detected lcore 34 as core 6 on socket 1 00:06:47.630 EAL: Detected lcore 35 as core 8 on socket 1 00:06:47.630 EAL: Detected lcore 36 as core 9 on socket 1 00:06:47.630 EAL: Detected lcore 37 as core 10 on socket 1 00:06:47.630 EAL: Detected lcore 38 as core 11 on socket 1 00:06:47.630 EAL: Detected lcore 39 as core 12 on socket 1 00:06:47.630 EAL: Detected lcore 40 as core 13 on socket 1 00:06:47.630 EAL: Detected lcore 41 as core 14 on socket 1 00:06:47.630 EAL: Detected lcore 42 as core 16 on socket 1 00:06:47.630 EAL: Detected lcore 43 as core 17 on socket 1 00:06:47.630 EAL: Detected lcore 44 as core 18 on socket 1 00:06:47.630 EAL: Detected lcore 45 as core 19 on socket 1 00:06:47.630 EAL: Detected lcore 46 as core 20 on socket 1 00:06:47.630 EAL: Detected lcore 47 as core 21 on socket 1 00:06:47.630 EAL: Detected lcore 48 as core 22 on socket 1 00:06:47.630 EAL: Detected lcore 49 as core 24 on socket 1 00:06:47.630 EAL: Detected lcore 50 as core 25 on socket 1 00:06:47.630 EAL: Detected lcore 51 as core 26 on socket 1 00:06:47.630 EAL: Detected lcore 52 as core 27 on socket 1 00:06:47.630 EAL: Detected lcore 53 as core 28 on socket 1 00:06:47.630 EAL: Detected lcore 54 as core 29 on socket 1 00:06:47.630 EAL: Detected lcore 55 as core 30 on socket 1 00:06:47.630 EAL: Detected lcore 56 as core 0 on socket 0 00:06:47.630 EAL: Detected lcore 57 as core 1 on socket 0 00:06:47.630 EAL: Detected lcore 58 as core 2 on socket 0 00:06:47.630 EAL: Detected lcore 59 as core 3 on socket 0 00:06:47.630 EAL: Detected lcore 60 as core 4 on socket 0 00:06:47.630 EAL: Detected lcore 61 as core 5 on socket 0 00:06:47.630 EAL: Detected lcore 62 as core 6 on socket 0 00:06:47.630 EAL: Detected lcore 63 as core 8 on socket 0 00:06:47.630 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:47.630 EAL: Detected lcore 65 as core 10 on socket 0 00:06:47.630 EAL: Detected lcore 66 as core 11 on socket 0 00:06:47.630 EAL: Detected lcore 67 as core 12 on socket 0 00:06:47.630 EAL: Detected lcore 68 as core 13 on socket 0 00:06:47.630 EAL: Detected lcore 69 as core 14 on socket 0 00:06:47.630 EAL: Detected lcore 70 as core 16 on socket 0 00:06:47.630 EAL: Detected lcore 71 as core 17 on socket 0 00:06:47.630 EAL: Detected lcore 72 as core 18 on socket 0 00:06:47.630 EAL: Detected lcore 73 as core 19 on socket 0 00:06:47.630 EAL: Detected lcore 74 as core 20 on socket 0 00:06:47.630 EAL: Detected lcore 75 as core 21 on socket 0 00:06:47.630 EAL: Detected lcore 76 as core 22 on socket 0 00:06:47.630 EAL: Detected lcore 77 as core 24 on socket 0 00:06:47.630 EAL: Detected lcore 78 as core 25 on socket 0 00:06:47.630 EAL: Detected lcore 79 as core 26 on socket 0 00:06:47.630 EAL: Detected lcore 80 as core 27 on socket 0 00:06:47.630 EAL: Detected lcore 81 as core 28 on socket 0 00:06:47.630 EAL: Detected lcore 82 as core 29 on socket 0 00:06:47.630 EAL: Detected lcore 83 as core 30 on socket 0 00:06:47.630 EAL: Detected lcore 84 as core 0 on socket 1 00:06:47.630 EAL: Detected lcore 85 as core 1 on socket 1 00:06:47.630 EAL: Detected lcore 86 as core 2 on socket 1 00:06:47.630 EAL: Detected lcore 87 as core 3 on socket 1 00:06:47.630 EAL: Detected lcore 88 as core 4 on socket 1 00:06:47.630 EAL: Detected lcore 89 as core 5 on socket 1 00:06:47.630 EAL: Detected lcore 90 as core 6 on socket 1 00:06:47.630 EAL: Detected lcore 91 as core 8 on socket 1 00:06:47.630 EAL: Detected lcore 92 as core 9 on socket 1 00:06:47.630 EAL: Detected lcore 93 as core 10 on socket 1 00:06:47.630 EAL: Detected lcore 94 as core 11 on socket 1 00:06:47.630 EAL: Detected lcore 95 as core 12 on socket 1 00:06:47.630 EAL: Detected lcore 96 as core 13 on socket 1 00:06:47.630 EAL: Detected lcore 97 as core 14 on socket 1 00:06:47.630 EAL: Detected lcore 98 as core 16 on socket 1 00:06:47.630 EAL: Detected lcore 99 as core 17 on socket 1 00:06:47.630 EAL: Detected lcore 100 as core 18 on socket 1 00:06:47.630 EAL: Detected lcore 101 as core 19 on socket 1 00:06:47.630 EAL: Detected lcore 102 as core 20 on socket 1 00:06:47.630 EAL: Detected lcore 103 as core 21 on socket 1 00:06:47.630 EAL: Detected lcore 104 as core 22 on socket 1 00:06:47.630 EAL: Detected lcore 105 as core 24 on socket 1 00:06:47.630 EAL: Detected lcore 106 as core 25 on socket 1 00:06:47.630 EAL: Detected lcore 107 as core 26 on socket 1 00:06:47.630 EAL: Detected lcore 108 as core 27 on socket 1 00:06:47.630 EAL: Detected lcore 109 as core 28 on socket 1 00:06:47.630 EAL: Detected lcore 110 as core 29 on socket 1 00:06:47.630 EAL: Detected lcore 111 as core 30 on socket 1 00:06:47.630 EAL: Maximum logical cores by configuration: 128 00:06:47.630 EAL: Detected CPU lcores: 112 00:06:47.630 EAL: Detected NUMA nodes: 2 00:06:47.630 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:06:47.630 EAL: Checking presence of .so 'librte_eal.so.23' 00:06:47.630 EAL: Checking presence of .so 'librte_eal.so' 00:06:47.630 EAL: Detected static linkage of DPDK 00:06:47.630 EAL: No shared files mode enabled, IPC will be disabled 00:06:47.630 EAL: Bus pci wants IOVA as 'DC' 00:06:47.630 EAL: Buses did not request a specific IOVA mode. 00:06:47.630 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:47.630 EAL: Selected IOVA mode 'VA' 00:06:47.630 EAL: Probing VFIO support... 
00:06:47.630 EAL: IOMMU type 1 (Type 1) is supported 00:06:47.630 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:47.630 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:47.630 EAL: VFIO support initialized 00:06:47.630 EAL: Ask a virtual area of 0x2e000 bytes 00:06:47.630 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:47.630 EAL: Setting up physically contiguous memory... 00:06:47.630 EAL: Setting maximum number of open files to 524288 00:06:47.630 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:47.630 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:47.630 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:47.630 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:47.630 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.630 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:47.630 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:47.630 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:47.630 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.630 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:47.630 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:47.630 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:47.630 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.630 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:47.630 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:47.630 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:47.630 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.630 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:47.630 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:47.630 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:47.630 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:47.630 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.630 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:47.630 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:47.630 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:47.630 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.630 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:47.630 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:47.630 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.630 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:47.631 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:47.631 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.631 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:47.631 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:47.631 EAL: Ask a virtual area of 0x61000 bytes 00:06:47.631 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:47.631 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:47.631 EAL: Ask a virtual area of 0x400000000 bytes 00:06:47.631 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:47.631 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:47.631 EAL: Hugepages will be freed exactly as allocated. 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: TSC frequency is ~2500000 KHz 00:06:47.631 EAL: Main lcore 0 is ready (tid=7f06b5fe0a00;cpuset=[0]) 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 0 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 2MB 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Mem event callback 'spdk:(nil)' registered 00:06:47.631 00:06:47.631 00:06:47.631 CUnit - A unit testing framework for C - Version 2.1-3 00:06:47.631 http://cunit.sourceforge.net/ 00:06:47.631 00:06:47.631 00:06:47.631 Suite: components_suite 00:06:47.631 Test: vtophys_malloc_test ...passed 00:06:47.631 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 4MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 4MB 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 6MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 6MB 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 10MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 10MB 00:06:47.631 EAL: Trying to obtain current memory policy. 
00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 18MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 18MB 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 34MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 34MB 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 66MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 66MB 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.631 EAL: Restoring previous memory policy: 4 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was expanded by 130MB 00:06:47.631 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.631 EAL: request: mp_malloc_sync 00:06:47.631 EAL: No shared files mode enabled, IPC is disabled 00:06:47.631 EAL: Heap on socket 0 was shrunk by 130MB 00:06:47.631 EAL: Trying to obtain current memory policy. 00:06:47.631 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.891 EAL: Restoring previous memory policy: 4 00:06:47.891 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.891 EAL: request: mp_malloc_sync 00:06:47.891 EAL: No shared files mode enabled, IPC is disabled 00:06:47.892 EAL: Heap on socket 0 was expanded by 258MB 00:06:47.892 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.892 EAL: request: mp_malloc_sync 00:06:47.892 EAL: No shared files mode enabled, IPC is disabled 00:06:47.892 EAL: Heap on socket 0 was shrunk by 258MB 00:06:47.892 EAL: Trying to obtain current memory policy. 
00:06:47.892 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.892 EAL: Restoring previous memory policy: 4 00:06:47.892 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.892 EAL: request: mp_malloc_sync 00:06:47.892 EAL: No shared files mode enabled, IPC is disabled 00:06:47.892 EAL: Heap on socket 0 was expanded by 514MB 00:06:48.152 EAL: Calling mem event callback 'spdk:(nil)' 00:06:48.152 EAL: request: mp_malloc_sync 00:06:48.152 EAL: No shared files mode enabled, IPC is disabled 00:06:48.152 EAL: Heap on socket 0 was shrunk by 514MB 00:06:48.152 EAL: Trying to obtain current memory policy. 00:06:48.152 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:48.412 EAL: Restoring previous memory policy: 4 00:06:48.412 EAL: Calling mem event callback 'spdk:(nil)' 00:06:48.412 EAL: request: mp_malloc_sync 00:06:48.412 EAL: No shared files mode enabled, IPC is disabled 00:06:48.412 EAL: Heap on socket 0 was expanded by 1026MB 00:06:48.412 EAL: Calling mem event callback 'spdk:(nil)' 00:06:48.673 EAL: request: mp_malloc_sync 00:06:48.673 EAL: No shared files mode enabled, IPC is disabled 00:06:48.673 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:48.673 passed 00:06:48.673 00:06:48.673 Run Summary: Type Total Ran Passed Failed Inactive 00:06:48.673 suites 1 1 n/a 0 0 00:06:48.673 tests 2 2 2 0 0 00:06:48.673 asserts 497 497 497 0 n/a 00:06:48.673 00:06:48.673 Elapsed time = 0.972 seconds 00:06:48.673 EAL: Calling mem event callback 'spdk:(nil)' 00:06:48.673 EAL: request: mp_malloc_sync 00:06:48.673 EAL: No shared files mode enabled, IPC is disabled 00:06:48.673 EAL: Heap on socket 0 was shrunk by 2MB 00:06:48.673 EAL: No shared files mode enabled, IPC is disabled 00:06:48.673 EAL: No shared files mode enabled, IPC is disabled 00:06:48.673 EAL: No shared files mode enabled, IPC is disabled 00:06:48.673 00:06:48.673 real 0m1.105s 00:06:48.673 user 0m0.648s 00:06:48.673 sys 0m0.432s 00:06:48.673 04:23:27 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:48.673 04:23:27 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:48.673 ************************************ 00:06:48.673 END TEST env_vtophys 00:06:48.673 ************************************ 00:06:48.673 04:23:27 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:48.673 04:23:27 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:48.673 04:23:27 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:48.673 04:23:27 env -- common/autotest_common.sh@10 -- # set +x 00:06:48.673 ************************************ 00:06:48.673 START TEST env_pci 00:06:48.673 ************************************ 00:06:48.673 04:23:27 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:48.673 00:06:48.673 00:06:48.673 CUnit - A unit testing framework for C - Version 2.1-3 00:06:48.673 http://cunit.sourceforge.net/ 00:06:48.673 00:06:48.673 00:06:48.673 Suite: pci 00:06:48.673 Test: pci_hook ...[2024-11-17 04:23:27.436927] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 133672 has claimed it 00:06:48.673 EAL: Cannot find device (10000:00:01.0) 00:06:48.673 EAL: Failed to attach device on primary process 00:06:48.673 passed 00:06:48.673 00:06:48.673 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:48.673 suites 1 1 n/a 0 0 00:06:48.673 tests 1 1 1 0 0 00:06:48.673 asserts 25 25 25 0 n/a 00:06:48.673 00:06:48.673 Elapsed time = 0.034 seconds 00:06:48.673 00:06:48.673 real 0m0.053s 00:06:48.673 user 0m0.016s 00:06:48.673 sys 0m0.037s 00:06:48.673 04:23:27 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:48.673 04:23:27 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:48.673 ************************************ 00:06:48.673 END TEST env_pci 00:06:48.673 ************************************ 00:06:48.934 04:23:27 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:48.934 04:23:27 env -- env/env.sh@15 -- # uname 00:06:48.934 04:23:27 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:48.934 04:23:27 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:48.934 04:23:27 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:48.934 04:23:27 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:48.934 04:23:27 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:48.934 04:23:27 env -- common/autotest_common.sh@10 -- # set +x 00:06:48.934 ************************************ 00:06:48.934 START TEST env_dpdk_post_init 00:06:48.934 ************************************ 00:06:48.934 04:23:27 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:48.934 EAL: Detected CPU lcores: 112 00:06:48.934 EAL: Detected NUMA nodes: 2 00:06:48.934 EAL: Detected static linkage of DPDK 00:06:48.934 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:48.934 EAL: Selected IOVA mode 'VA' 00:06:48.934 EAL: VFIO support initialized 00:06:48.934 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:48.934 EAL: Using IOMMU type 1 (Type 1) 00:06:49.874 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:53.185 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:53.185 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:53.756 Starting DPDK initialization... 00:06:53.756 Starting SPDK post initialization... 00:06:53.756 SPDK NVMe probe 00:06:53.756 Attaching to 0000:d8:00.0 00:06:53.756 Attached to 0000:d8:00.0 00:06:53.756 Cleaning up... 
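The argument handling traced just above (env.sh@14 through env.sh@24) reduces to roughly the following; a sketch only, using the workspace paths from this run:

    # build EAL arguments for the post-init test: single core mask plus, on
    # Linux, a fixed base virtual address so DPDK mappings land in a known range
    argv='-c 0x1 '
    if [ "$(uname)" = Linux ]; then
        argv+='--base-virtaddr=0x200000000000'
    fi
    run_test env_dpdk_post_init \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init $argv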
00:06:53.756 00:06:53.756 real 0m4.769s 00:06:53.756 user 0m3.585s 00:06:53.756 sys 0m0.430s 00:06:53.756 04:23:32 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.756 04:23:32 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:53.756 ************************************ 00:06:53.756 END TEST env_dpdk_post_init 00:06:53.756 ************************************ 00:06:53.756 04:23:32 env -- env/env.sh@26 -- # uname 00:06:53.756 04:23:32 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:53.756 04:23:32 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:53.756 04:23:32 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.756 04:23:32 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.756 04:23:32 env -- common/autotest_common.sh@10 -- # set +x 00:06:53.756 ************************************ 00:06:53.756 START TEST env_mem_callbacks 00:06:53.756 ************************************ 00:06:53.756 04:23:32 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:53.756 EAL: Detected CPU lcores: 112 00:06:53.756 EAL: Detected NUMA nodes: 2 00:06:53.756 EAL: Detected static linkage of DPDK 00:06:53.756 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:53.756 EAL: Selected IOVA mode 'VA' 00:06:53.756 EAL: VFIO support initialized 00:06:53.756 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:53.756 00:06:53.756 00:06:53.756 CUnit - A unit testing framework for C - Version 2.1-3 00:06:53.756 http://cunit.sourceforge.net/ 00:06:53.756 00:06:53.756 00:06:53.756 Suite: memory 00:06:53.756 Test: test ... 
00:06:53.756 register 0x200000200000 2097152 00:06:53.756 malloc 3145728 00:06:53.756 register 0x200000400000 4194304 00:06:53.756 buf 0x200000500000 len 3145728 PASSED 00:06:53.756 malloc 64 00:06:53.756 buf 0x2000004fff40 len 64 PASSED 00:06:53.756 malloc 4194304 00:06:53.756 register 0x200000800000 6291456 00:06:53.756 buf 0x200000a00000 len 4194304 PASSED 00:06:53.756 free 0x200000500000 3145728 00:06:53.756 free 0x2000004fff40 64 00:06:53.756 unregister 0x200000400000 4194304 PASSED 00:06:53.756 free 0x200000a00000 4194304 00:06:53.756 unregister 0x200000800000 6291456 PASSED 00:06:53.756 malloc 8388608 00:06:53.756 register 0x200000400000 10485760 00:06:53.756 buf 0x200000600000 len 8388608 PASSED 00:06:53.756 free 0x200000600000 8388608 00:06:53.756 unregister 0x200000400000 10485760 PASSED 00:06:53.756 passed 00:06:53.756 00:06:53.756 Run Summary: Type Total Ran Passed Failed Inactive 00:06:53.756 suites 1 1 n/a 0 0 00:06:53.756 tests 1 1 1 0 0 00:06:53.756 asserts 15 15 15 0 n/a 00:06:53.756 00:06:53.756 Elapsed time = 0.008 seconds 00:06:53.756 00:06:53.756 real 0m0.070s 00:06:53.756 user 0m0.016s 00:06:53.756 sys 0m0.054s 00:06:53.756 04:23:32 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.756 04:23:32 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:53.756 ************************************ 00:06:53.756 END TEST env_mem_callbacks 00:06:53.756 ************************************ 00:06:53.756 00:06:53.756 real 0m6.698s 00:06:53.756 user 0m4.581s 00:06:53.756 sys 0m1.387s 00:06:53.756 04:23:32 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.756 04:23:32 env -- common/autotest_common.sh@10 -- # set +x 00:06:53.756 ************************************ 00:06:53.756 END TEST env 00:06:53.756 ************************************ 00:06:53.756 04:23:32 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:53.756 04:23:32 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.756 04:23:32 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.756 04:23:32 -- common/autotest_common.sh@10 -- # set +x 00:06:54.017 ************************************ 00:06:54.017 START TEST rpc 00:06:54.017 ************************************ 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:54.017 * Looking for test storage... 
00:06:54.017 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:54.017 04:23:32 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:54.017 04:23:32 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:54.017 04:23:32 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:54.017 04:23:32 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:54.017 04:23:32 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:54.017 04:23:32 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:54.017 04:23:32 rpc -- scripts/common.sh@345 -- # : 1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:54.017 04:23:32 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:54.017 04:23:32 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@353 -- # local d=1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:54.017 04:23:32 rpc -- scripts/common.sh@355 -- # echo 1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:54.017 04:23:32 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@353 -- # local d=2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:54.017 04:23:32 rpc -- scripts/common.sh@355 -- # echo 2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:54.017 04:23:32 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:54.017 04:23:32 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:54.017 04:23:32 rpc -- scripts/common.sh@368 -- # return 0 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:54.017 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.017 --rc genhtml_branch_coverage=1 00:06:54.017 --rc genhtml_function_coverage=1 00:06:54.017 --rc genhtml_legend=1 00:06:54.017 --rc geninfo_all_blocks=1 00:06:54.017 --rc geninfo_unexecuted_blocks=1 00:06:54.017 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.017 ' 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:54.017 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.017 --rc genhtml_branch_coverage=1 00:06:54.017 --rc genhtml_function_coverage=1 00:06:54.017 --rc genhtml_legend=1 00:06:54.017 --rc geninfo_all_blocks=1 00:06:54.017 --rc geninfo_unexecuted_blocks=1 00:06:54.017 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.017 ' 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1707 -- # 
export 'LCOV=lcov 00:06:54.017 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.017 --rc genhtml_branch_coverage=1 00:06:54.017 --rc genhtml_function_coverage=1 00:06:54.017 --rc genhtml_legend=1 00:06:54.017 --rc geninfo_all_blocks=1 00:06:54.017 --rc geninfo_unexecuted_blocks=1 00:06:54.017 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.017 ' 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:54.017 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.017 --rc genhtml_branch_coverage=1 00:06:54.017 --rc genhtml_function_coverage=1 00:06:54.017 --rc genhtml_legend=1 00:06:54.017 --rc geninfo_all_blocks=1 00:06:54.017 --rc geninfo_unexecuted_blocks=1 00:06:54.017 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.017 ' 00:06:54.017 04:23:32 rpc -- rpc/rpc.sh@65 -- # spdk_pid=134843 00:06:54.017 04:23:32 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:54.017 04:23:32 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:54.017 04:23:32 rpc -- rpc/rpc.sh@67 -- # waitforlisten 134843 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@835 -- # '[' -z 134843 ']' 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:54.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:54.017 04:23:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.278 [2024-11-17 04:23:32.850887] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:54.278 [2024-11-17 04:23:32.850977] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134843 ] 00:06:54.278 [2024-11-17 04:23:32.936247] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.278 [2024-11-17 04:23:32.957439] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:54.278 [2024-11-17 04:23:32.957475] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 134843' to capture a snapshot of events at runtime. 00:06:54.278 [2024-11-17 04:23:32.957485] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:54.278 [2024-11-17 04:23:32.957494] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:54.278 [2024-11-17 04:23:32.957500] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid134843 for offline analysis/debug. 
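The target startup for the rpc suite, as traced above (rpc.sh@64 through rpc.sh@67), follows roughly this pattern; a sketch, assuming the target is backgrounded and that waitforlisten polls the RPC socket until spdk_tgt answers:

    # start spdk_tgt with the bdev tracepoint group enabled, remember its pid,
    # make sure it is killed on exit, then wait for /var/tmp/spdk.sock to answer
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev &
    spdk_pid=$!
    trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
    waitforlisten $spdk_pid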
00:06:54.278 [2024-11-17 04:23:32.958079] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.539 04:23:33 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:54.539 04:23:33 rpc -- common/autotest_common.sh@868 -- # return 0 00:06:54.539 04:23:33 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:54.539 04:23:33 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:54.539 04:23:33 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:54.539 04:23:33 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:54.539 04:23:33 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:54.539 04:23:33 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:54.539 04:23:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.539 ************************************ 00:06:54.539 START TEST rpc_integrity 00:06:54.539 ************************************ 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:54.539 { 00:06:54.539 "name": "Malloc0", 00:06:54.539 "aliases": [ 00:06:54.539 "acb78492-4eed-4fff-802b-c348ab1b7b21" 00:06:54.539 ], 00:06:54.539 "product_name": "Malloc disk", 00:06:54.539 "block_size": 512, 00:06:54.539 "num_blocks": 16384, 00:06:54.539 "uuid": "acb78492-4eed-4fff-802b-c348ab1b7b21", 00:06:54.539 "assigned_rate_limits": { 00:06:54.539 "rw_ios_per_sec": 0, 00:06:54.539 "rw_mbytes_per_sec": 0, 00:06:54.539 "r_mbytes_per_sec": 0, 00:06:54.539 "w_mbytes_per_sec": 
0 00:06:54.539 }, 00:06:54.539 "claimed": false, 00:06:54.539 "zoned": false, 00:06:54.539 "supported_io_types": { 00:06:54.539 "read": true, 00:06:54.539 "write": true, 00:06:54.539 "unmap": true, 00:06:54.539 "flush": true, 00:06:54.539 "reset": true, 00:06:54.539 "nvme_admin": false, 00:06:54.539 "nvme_io": false, 00:06:54.539 "nvme_io_md": false, 00:06:54.539 "write_zeroes": true, 00:06:54.539 "zcopy": true, 00:06:54.539 "get_zone_info": false, 00:06:54.539 "zone_management": false, 00:06:54.539 "zone_append": false, 00:06:54.539 "compare": false, 00:06:54.539 "compare_and_write": false, 00:06:54.539 "abort": true, 00:06:54.539 "seek_hole": false, 00:06:54.539 "seek_data": false, 00:06:54.539 "copy": true, 00:06:54.539 "nvme_iov_md": false 00:06:54.539 }, 00:06:54.539 "memory_domains": [ 00:06:54.539 { 00:06:54.539 "dma_device_id": "system", 00:06:54.539 "dma_device_type": 1 00:06:54.539 }, 00:06:54.539 { 00:06:54.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.539 "dma_device_type": 2 00:06:54.539 } 00:06:54.539 ], 00:06:54.539 "driver_specific": {} 00:06:54.539 } 00:06:54.539 ]' 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.539 [2024-11-17 04:23:33.337492] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:54.539 [2024-11-17 04:23:33.337525] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:54.539 [2024-11-17 04:23:33.337542] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4fd98c0 00:06:54.539 [2024-11-17 04:23:33.337551] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:54.539 [2024-11-17 04:23:33.338457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:54.539 [2024-11-17 04:23:33.338482] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:54.539 Passthru0 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.539 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.539 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:54.800 { 00:06:54.800 "name": "Malloc0", 00:06:54.800 "aliases": [ 00:06:54.800 "acb78492-4eed-4fff-802b-c348ab1b7b21" 00:06:54.800 ], 00:06:54.800 "product_name": "Malloc disk", 00:06:54.800 "block_size": 512, 00:06:54.800 "num_blocks": 16384, 00:06:54.800 "uuid": "acb78492-4eed-4fff-802b-c348ab1b7b21", 00:06:54.800 "assigned_rate_limits": { 00:06:54.800 "rw_ios_per_sec": 0, 00:06:54.800 "rw_mbytes_per_sec": 0, 00:06:54.800 "r_mbytes_per_sec": 0, 00:06:54.800 "w_mbytes_per_sec": 0 00:06:54.800 }, 00:06:54.800 "claimed": true, 00:06:54.800 "claim_type": "exclusive_write", 00:06:54.800 "zoned": false, 00:06:54.800 "supported_io_types": { 00:06:54.800 "read": true, 00:06:54.800 "write": true, 00:06:54.800 "unmap": true, 
00:06:54.800 "flush": true, 00:06:54.800 "reset": true, 00:06:54.800 "nvme_admin": false, 00:06:54.800 "nvme_io": false, 00:06:54.800 "nvme_io_md": false, 00:06:54.800 "write_zeroes": true, 00:06:54.800 "zcopy": true, 00:06:54.800 "get_zone_info": false, 00:06:54.800 "zone_management": false, 00:06:54.800 "zone_append": false, 00:06:54.800 "compare": false, 00:06:54.800 "compare_and_write": false, 00:06:54.800 "abort": true, 00:06:54.800 "seek_hole": false, 00:06:54.800 "seek_data": false, 00:06:54.800 "copy": true, 00:06:54.800 "nvme_iov_md": false 00:06:54.800 }, 00:06:54.800 "memory_domains": [ 00:06:54.800 { 00:06:54.800 "dma_device_id": "system", 00:06:54.800 "dma_device_type": 1 00:06:54.800 }, 00:06:54.800 { 00:06:54.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.800 "dma_device_type": 2 00:06:54.800 } 00:06:54.800 ], 00:06:54.800 "driver_specific": {} 00:06:54.800 }, 00:06:54.800 { 00:06:54.800 "name": "Passthru0", 00:06:54.800 "aliases": [ 00:06:54.800 "3e0c9d6b-84bd-5bfb-8337-64b42d5bdc8a" 00:06:54.800 ], 00:06:54.800 "product_name": "passthru", 00:06:54.800 "block_size": 512, 00:06:54.800 "num_blocks": 16384, 00:06:54.800 "uuid": "3e0c9d6b-84bd-5bfb-8337-64b42d5bdc8a", 00:06:54.800 "assigned_rate_limits": { 00:06:54.800 "rw_ios_per_sec": 0, 00:06:54.800 "rw_mbytes_per_sec": 0, 00:06:54.800 "r_mbytes_per_sec": 0, 00:06:54.800 "w_mbytes_per_sec": 0 00:06:54.800 }, 00:06:54.800 "claimed": false, 00:06:54.800 "zoned": false, 00:06:54.800 "supported_io_types": { 00:06:54.800 "read": true, 00:06:54.800 "write": true, 00:06:54.800 "unmap": true, 00:06:54.800 "flush": true, 00:06:54.800 "reset": true, 00:06:54.800 "nvme_admin": false, 00:06:54.800 "nvme_io": false, 00:06:54.800 "nvme_io_md": false, 00:06:54.800 "write_zeroes": true, 00:06:54.800 "zcopy": true, 00:06:54.800 "get_zone_info": false, 00:06:54.800 "zone_management": false, 00:06:54.800 "zone_append": false, 00:06:54.800 "compare": false, 00:06:54.800 "compare_and_write": false, 00:06:54.800 "abort": true, 00:06:54.800 "seek_hole": false, 00:06:54.800 "seek_data": false, 00:06:54.800 "copy": true, 00:06:54.800 "nvme_iov_md": false 00:06:54.800 }, 00:06:54.800 "memory_domains": [ 00:06:54.800 { 00:06:54.800 "dma_device_id": "system", 00:06:54.800 "dma_device_type": 1 00:06:54.800 }, 00:06:54.800 { 00:06:54.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.800 "dma_device_type": 2 00:06:54.800 } 00:06:54.800 ], 00:06:54.800 "driver_specific": { 00:06:54.800 "passthru": { 00:06:54.800 "name": "Passthru0", 00:06:54.800 "base_bdev_name": "Malloc0" 00:06:54.800 } 00:06:54.800 } 00:06:54.800 } 00:06:54.800 ]' 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.800 04:23:33 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:54.800 04:23:33 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:54.800 00:06:54.800 real 0m0.280s 00:06:54.800 user 0m0.166s 00:06:54.800 sys 0m0.054s 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:54.800 04:23:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.800 ************************************ 00:06:54.800 END TEST rpc_integrity 00:06:54.800 ************************************ 00:06:54.801 04:23:33 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:54.801 04:23:33 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:54.801 04:23:33 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:54.801 04:23:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.801 ************************************ 00:06:54.801 START TEST rpc_plugins 00:06:54.801 ************************************ 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:06:54.801 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.801 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:54.801 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.801 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.801 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:54.801 { 00:06:54.801 "name": "Malloc1", 00:06:54.801 "aliases": [ 00:06:54.801 "2dcd8183-633d-41a3-85db-30a6d7a33c6b" 00:06:54.801 ], 00:06:54.801 "product_name": "Malloc disk", 00:06:54.801 "block_size": 4096, 00:06:54.801 "num_blocks": 256, 00:06:54.801 "uuid": "2dcd8183-633d-41a3-85db-30a6d7a33c6b", 00:06:54.801 "assigned_rate_limits": { 00:06:54.801 "rw_ios_per_sec": 0, 00:06:54.801 "rw_mbytes_per_sec": 0, 00:06:54.801 "r_mbytes_per_sec": 0, 00:06:54.801 "w_mbytes_per_sec": 0 00:06:54.801 }, 00:06:54.801 "claimed": false, 00:06:54.801 "zoned": false, 00:06:54.801 "supported_io_types": { 00:06:54.801 "read": true, 00:06:54.801 "write": true, 00:06:54.801 "unmap": true, 00:06:54.801 "flush": true, 00:06:54.801 "reset": true, 00:06:54.801 "nvme_admin": false, 00:06:54.801 "nvme_io": false, 00:06:54.801 "nvme_io_md": false, 00:06:54.801 "write_zeroes": true, 00:06:54.801 "zcopy": true, 00:06:54.801 "get_zone_info": false, 00:06:54.801 "zone_management": false, 00:06:54.801 "zone_append": false, 00:06:54.801 "compare": false, 00:06:54.801 "compare_and_write": false, 00:06:54.801 "abort": true, 00:06:54.801 "seek_hole": false, 00:06:54.801 "seek_data": false, 00:06:54.801 "copy": true, 00:06:54.801 
"nvme_iov_md": false 00:06:54.801 }, 00:06:54.801 "memory_domains": [ 00:06:54.801 { 00:06:54.801 "dma_device_id": "system", 00:06:54.801 "dma_device_type": 1 00:06:54.801 }, 00:06:54.801 { 00:06:54.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.801 "dma_device_type": 2 00:06:54.801 } 00:06:54.801 ], 00:06:54.801 "driver_specific": {} 00:06:54.801 } 00:06:54.801 ]' 00:06:54.801 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:55.061 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:55.061 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.061 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.061 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:55.061 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:55.061 04:23:33 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:55.061 00:06:55.061 real 0m0.155s 00:06:55.061 user 0m0.093s 00:06:55.061 sys 0m0.026s 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.061 04:23:33 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:55.061 ************************************ 00:06:55.061 END TEST rpc_plugins 00:06:55.061 ************************************ 00:06:55.061 04:23:33 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:55.061 04:23:33 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:55.061 04:23:33 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:55.061 04:23:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:55.061 ************************************ 00:06:55.061 START TEST rpc_trace_cmd_test 00:06:55.061 ************************************ 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.061 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:55.061 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid134843", 00:06:55.061 "tpoint_group_mask": "0x8", 00:06:55.061 "iscsi_conn": { 00:06:55.061 "mask": "0x2", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "scsi": { 00:06:55.061 "mask": "0x4", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "bdev": { 00:06:55.061 "mask": "0x8", 00:06:55.061 "tpoint_mask": "0xffffffffffffffff" 00:06:55.061 }, 00:06:55.061 "nvmf_rdma": { 00:06:55.061 "mask": "0x10", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "nvmf_tcp": { 00:06:55.061 "mask": "0x20", 
00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "ftl": { 00:06:55.061 "mask": "0x40", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "blobfs": { 00:06:55.061 "mask": "0x80", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "dsa": { 00:06:55.061 "mask": "0x200", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "thread": { 00:06:55.061 "mask": "0x400", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "nvme_pcie": { 00:06:55.061 "mask": "0x800", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "iaa": { 00:06:55.061 "mask": "0x1000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "nvme_tcp": { 00:06:55.061 "mask": "0x2000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "bdev_nvme": { 00:06:55.061 "mask": "0x4000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "sock": { 00:06:55.061 "mask": "0x8000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "blob": { 00:06:55.061 "mask": "0x10000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "bdev_raid": { 00:06:55.061 "mask": "0x20000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 }, 00:06:55.061 "scheduler": { 00:06:55.061 "mask": "0x40000", 00:06:55.061 "tpoint_mask": "0x0" 00:06:55.061 } 00:06:55.061 }' 00:06:55.062 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:55.062 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:06:55.062 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:55.322 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:55.322 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:55.322 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:55.322 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:55.322 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:55.322 04:23:33 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:55.322 04:23:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:55.322 00:06:55.322 real 0m0.220s 00:06:55.322 user 0m0.171s 00:06:55.322 sys 0m0.040s 00:06:55.322 04:23:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.322 04:23:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:55.322 ************************************ 00:06:55.322 END TEST rpc_trace_cmd_test 00:06:55.322 ************************************ 00:06:55.322 04:23:34 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:55.322 04:23:34 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:55.322 04:23:34 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:55.322 04:23:34 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:55.322 04:23:34 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:55.322 04:23:34 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:55.322 ************************************ 00:06:55.322 START TEST rpc_daemon_integrity 00:06:55.322 ************************************ 00:06:55.322 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:55.322 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:55.322 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.322 04:23:34 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.322 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.322 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:55.322 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:55.584 { 00:06:55.584 "name": "Malloc2", 00:06:55.584 "aliases": [ 00:06:55.584 "5adbc0a6-a9ef-40ae-86c8-258abdbe9ec6" 00:06:55.584 ], 00:06:55.584 "product_name": "Malloc disk", 00:06:55.584 "block_size": 512, 00:06:55.584 "num_blocks": 16384, 00:06:55.584 "uuid": "5adbc0a6-a9ef-40ae-86c8-258abdbe9ec6", 00:06:55.584 "assigned_rate_limits": { 00:06:55.584 "rw_ios_per_sec": 0, 00:06:55.584 "rw_mbytes_per_sec": 0, 00:06:55.584 "r_mbytes_per_sec": 0, 00:06:55.584 "w_mbytes_per_sec": 0 00:06:55.584 }, 00:06:55.584 "claimed": false, 00:06:55.584 "zoned": false, 00:06:55.584 "supported_io_types": { 00:06:55.584 "read": true, 00:06:55.584 "write": true, 00:06:55.584 "unmap": true, 00:06:55.584 "flush": true, 00:06:55.584 "reset": true, 00:06:55.584 "nvme_admin": false, 00:06:55.584 "nvme_io": false, 00:06:55.584 "nvme_io_md": false, 00:06:55.584 "write_zeroes": true, 00:06:55.584 "zcopy": true, 00:06:55.584 "get_zone_info": false, 00:06:55.584 "zone_management": false, 00:06:55.584 "zone_append": false, 00:06:55.584 "compare": false, 00:06:55.584 "compare_and_write": false, 00:06:55.584 "abort": true, 00:06:55.584 "seek_hole": false, 00:06:55.584 "seek_data": false, 00:06:55.584 "copy": true, 00:06:55.584 "nvme_iov_md": false 00:06:55.584 }, 00:06:55.584 "memory_domains": [ 00:06:55.584 { 00:06:55.584 "dma_device_id": "system", 00:06:55.584 "dma_device_type": 1 00:06:55.584 }, 00:06:55.584 { 00:06:55.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:55.584 "dma_device_type": 2 00:06:55.584 } 00:06:55.584 ], 00:06:55.584 "driver_specific": {} 00:06:55.584 } 00:06:55.584 ]' 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 [2024-11-17 04:23:34.235979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:55.584 
[2024-11-17 04:23:34.236011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:55.584 [2024-11-17 04:23:34.236027] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4fcfa20 00:06:55.584 [2024-11-17 04:23:34.236036] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:55.584 [2024-11-17 04:23:34.236774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:55.584 [2024-11-17 04:23:34.236797] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:55.584 Passthru0 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:55.584 { 00:06:55.584 "name": "Malloc2", 00:06:55.584 "aliases": [ 00:06:55.584 "5adbc0a6-a9ef-40ae-86c8-258abdbe9ec6" 00:06:55.584 ], 00:06:55.584 "product_name": "Malloc disk", 00:06:55.584 "block_size": 512, 00:06:55.584 "num_blocks": 16384, 00:06:55.584 "uuid": "5adbc0a6-a9ef-40ae-86c8-258abdbe9ec6", 00:06:55.584 "assigned_rate_limits": { 00:06:55.584 "rw_ios_per_sec": 0, 00:06:55.584 "rw_mbytes_per_sec": 0, 00:06:55.584 "r_mbytes_per_sec": 0, 00:06:55.584 "w_mbytes_per_sec": 0 00:06:55.584 }, 00:06:55.584 "claimed": true, 00:06:55.584 "claim_type": "exclusive_write", 00:06:55.584 "zoned": false, 00:06:55.584 "supported_io_types": { 00:06:55.584 "read": true, 00:06:55.584 "write": true, 00:06:55.584 "unmap": true, 00:06:55.584 "flush": true, 00:06:55.584 "reset": true, 00:06:55.584 "nvme_admin": false, 00:06:55.584 "nvme_io": false, 00:06:55.584 "nvme_io_md": false, 00:06:55.584 "write_zeroes": true, 00:06:55.584 "zcopy": true, 00:06:55.584 "get_zone_info": false, 00:06:55.584 "zone_management": false, 00:06:55.584 "zone_append": false, 00:06:55.584 "compare": false, 00:06:55.584 "compare_and_write": false, 00:06:55.584 "abort": true, 00:06:55.584 "seek_hole": false, 00:06:55.584 "seek_data": false, 00:06:55.584 "copy": true, 00:06:55.584 "nvme_iov_md": false 00:06:55.584 }, 00:06:55.584 "memory_domains": [ 00:06:55.584 { 00:06:55.584 "dma_device_id": "system", 00:06:55.584 "dma_device_type": 1 00:06:55.584 }, 00:06:55.584 { 00:06:55.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:55.584 "dma_device_type": 2 00:06:55.584 } 00:06:55.584 ], 00:06:55.584 "driver_specific": {} 00:06:55.584 }, 00:06:55.584 { 00:06:55.584 "name": "Passthru0", 00:06:55.584 "aliases": [ 00:06:55.584 "f0f5b246-3b71-5f2c-a9b0-433234bf9d3e" 00:06:55.584 ], 00:06:55.584 "product_name": "passthru", 00:06:55.584 "block_size": 512, 00:06:55.584 "num_blocks": 16384, 00:06:55.584 "uuid": "f0f5b246-3b71-5f2c-a9b0-433234bf9d3e", 00:06:55.584 "assigned_rate_limits": { 00:06:55.584 "rw_ios_per_sec": 0, 00:06:55.584 "rw_mbytes_per_sec": 0, 00:06:55.584 "r_mbytes_per_sec": 0, 00:06:55.584 "w_mbytes_per_sec": 0 00:06:55.584 }, 00:06:55.584 "claimed": false, 00:06:55.584 "zoned": false, 00:06:55.584 "supported_io_types": { 00:06:55.584 "read": true, 00:06:55.584 "write": true, 00:06:55.584 "unmap": true, 00:06:55.584 "flush": true, 00:06:55.584 "reset": true, 
00:06:55.584 "nvme_admin": false, 00:06:55.584 "nvme_io": false, 00:06:55.584 "nvme_io_md": false, 00:06:55.584 "write_zeroes": true, 00:06:55.584 "zcopy": true, 00:06:55.584 "get_zone_info": false, 00:06:55.584 "zone_management": false, 00:06:55.584 "zone_append": false, 00:06:55.584 "compare": false, 00:06:55.584 "compare_and_write": false, 00:06:55.584 "abort": true, 00:06:55.584 "seek_hole": false, 00:06:55.584 "seek_data": false, 00:06:55.584 "copy": true, 00:06:55.584 "nvme_iov_md": false 00:06:55.584 }, 00:06:55.584 "memory_domains": [ 00:06:55.584 { 00:06:55.584 "dma_device_id": "system", 00:06:55.584 "dma_device_type": 1 00:06:55.584 }, 00:06:55.584 { 00:06:55.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:55.584 "dma_device_type": 2 00:06:55.584 } 00:06:55.584 ], 00:06:55.584 "driver_specific": { 00:06:55.584 "passthru": { 00:06:55.584 "name": "Passthru0", 00:06:55.584 "base_bdev_name": "Malloc2" 00:06:55.584 } 00:06:55.584 } 00:06:55.584 } 00:06:55.584 ]' 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:55.584 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:55.585 04:23:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:55.585 00:06:55.585 real 0m0.287s 00:06:55.585 user 0m0.179s 00:06:55.585 sys 0m0.047s 00:06:55.585 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.585 04:23:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:55.585 ************************************ 00:06:55.585 END TEST rpc_daemon_integrity 00:06:55.585 ************************************ 00:06:55.845 04:23:34 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:55.845 04:23:34 rpc -- rpc/rpc.sh@84 -- # killprocess 134843 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@954 -- # '[' -z 134843 ']' 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@958 -- # kill -0 134843 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@959 -- # uname 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 134843 
00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 134843' 00:06:55.845 killing process with pid 134843 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@973 -- # kill 134843 00:06:55.845 04:23:34 rpc -- common/autotest_common.sh@978 -- # wait 134843 00:06:56.106 00:06:56.106 real 0m2.147s 00:06:56.106 user 0m2.687s 00:06:56.106 sys 0m0.843s 00:06:56.106 04:23:34 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:56.106 04:23:34 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:56.106 ************************************ 00:06:56.106 END TEST rpc 00:06:56.106 ************************************ 00:06:56.106 04:23:34 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:56.106 04:23:34 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:56.106 04:23:34 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:56.106 04:23:34 -- common/autotest_common.sh@10 -- # set +x 00:06:56.106 ************************************ 00:06:56.106 START TEST skip_rpc 00:06:56.106 ************************************ 00:06:56.106 04:23:34 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:56.366 * Looking for test storage... 00:06:56.366 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:56.366 04:23:34 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:56.366 04:23:34 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:56.366 04:23:34 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:56.366 04:23:35 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:56.366 04:23:35 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:56.367 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.367 --rc genhtml_branch_coverage=1 00:06:56.367 --rc genhtml_function_coverage=1 00:06:56.367 --rc genhtml_legend=1 00:06:56.367 --rc geninfo_all_blocks=1 00:06:56.367 --rc geninfo_unexecuted_blocks=1 00:06:56.367 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.367 ' 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:56.367 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.367 --rc genhtml_branch_coverage=1 00:06:56.367 --rc genhtml_function_coverage=1 00:06:56.367 --rc genhtml_legend=1 00:06:56.367 --rc geninfo_all_blocks=1 00:06:56.367 --rc geninfo_unexecuted_blocks=1 00:06:56.367 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.367 ' 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:56.367 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.367 --rc genhtml_branch_coverage=1 00:06:56.367 --rc genhtml_function_coverage=1 00:06:56.367 --rc genhtml_legend=1 00:06:56.367 --rc geninfo_all_blocks=1 00:06:56.367 --rc geninfo_unexecuted_blocks=1 00:06:56.367 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.367 ' 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:56.367 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.367 --rc genhtml_branch_coverage=1 00:06:56.367 --rc genhtml_function_coverage=1 00:06:56.367 --rc genhtml_legend=1 00:06:56.367 --rc geninfo_all_blocks=1 00:06:56.367 --rc geninfo_unexecuted_blocks=1 00:06:56.367 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.367 ' 00:06:56.367 04:23:35 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:56.367 04:23:35 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:56.367 04:23:35 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:56.367 04:23:35 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:56.367 04:23:35 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:56.367 ************************************ 00:06:56.367 START TEST skip_rpc 00:06:56.367 ************************************ 00:06:56.367 04:23:35 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:06:56.367 04:23:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=135303 00:06:56.367 04:23:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:56.367 04:23:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:56.367 04:23:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:56.367 [2024-11-17 04:23:35.113921] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:06:56.367 [2024-11-17 04:23:35.113988] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid135303 ] 00:06:56.627 [2024-11-17 04:23:35.197066] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.627 [2024-11-17 04:23:35.218840] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:01.908 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 135303 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 135303 ']' 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 135303 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 135303 00:07:01.909 
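The skip_rpc case above asserts the negative path: with --no-rpc-server the target comes up but never opens /var/tmp/spdk.sock, so any RPC attempt must fail. A rough equivalent of that check, assuming the same build tree (the rpc.py timeout flag and the 5-second sleep are illustrative, mirroring the script's own sleep):

  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK_DIR/build/bin/spdk_tgt" --no-rpc-server -m 0x1 &
  pid=$!
  sleep 5                                              # let the app finish startup
  if "$SPDK_DIR/scripts/rpc.py" -t 2 spdk_get_version; then
      echo "unexpected: RPC server answered" >&2       # would mean --no-rpc-server was ignored
      kill "$pid"; exit 1
  fi
  kill "$pid"                                          # RPC failed as expected; test passes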
04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 135303' 00:07:01.909 killing process with pid 135303 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 135303 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 135303 00:07:01.909 00:07:01.909 real 0m5.369s 00:07:01.909 user 0m5.115s 00:07:01.909 sys 0m0.305s 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:01.909 04:23:40 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.909 ************************************ 00:07:01.909 END TEST skip_rpc 00:07:01.909 ************************************ 00:07:01.909 04:23:40 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:01.909 04:23:40 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:01.909 04:23:40 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:01.909 04:23:40 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.909 ************************************ 00:07:01.909 START TEST skip_rpc_with_json 00:07:01.909 ************************************ 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=136380 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 136380 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 136380 ']' 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:01.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:01.909 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:01.909 [2024-11-17 04:23:40.571713] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:01.909 [2024-11-17 04:23:40.571779] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid136380 ] 00:07:01.909 [2024-11-17 04:23:40.656115] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.909 [2024-11-17 04:23:40.678061] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:02.169 [2024-11-17 04:23:40.885549] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:02.169 request: 00:07:02.169 { 00:07:02.169 "trtype": "tcp", 00:07:02.169 "method": "nvmf_get_transports", 00:07:02.169 "req_id": 1 00:07:02.169 } 00:07:02.169 Got JSON-RPC error response 00:07:02.169 response: 00:07:02.169 { 00:07:02.169 "code": -19, 00:07:02.169 "message": "No such device" 00:07:02.169 } 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:02.169 [2024-11-17 04:23:40.897633] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:02.169 04:23:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:02.514 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:02.514 04:23:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:02.514 { 00:07:02.514 "subsystems": [ 00:07:02.514 { 00:07:02.514 "subsystem": "scheduler", 00:07:02.514 "config": [ 00:07:02.514 { 00:07:02.514 "method": "framework_set_scheduler", 00:07:02.514 "params": { 00:07:02.514 "name": "static" 00:07:02.514 } 00:07:02.514 } 00:07:02.514 ] 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "subsystem": "vmd", 00:07:02.514 "config": [] 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "subsystem": "sock", 00:07:02.514 "config": [ 00:07:02.514 { 00:07:02.514 "method": "sock_set_default_impl", 00:07:02.514 "params": { 00:07:02.514 "impl_name": "posix" 00:07:02.514 } 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "method": "sock_impl_set_options", 00:07:02.514 "params": { 00:07:02.514 "impl_name": "ssl", 00:07:02.514 "recv_buf_size": 4096, 00:07:02.514 "send_buf_size": 4096, 00:07:02.514 "enable_recv_pipe": true, 00:07:02.514 "enable_quickack": false, 00:07:02.514 "enable_placement_id": 
0, 00:07:02.514 "enable_zerocopy_send_server": true, 00:07:02.514 "enable_zerocopy_send_client": false, 00:07:02.514 "zerocopy_threshold": 0, 00:07:02.514 "tls_version": 0, 00:07:02.514 "enable_ktls": false 00:07:02.514 } 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "method": "sock_impl_set_options", 00:07:02.514 "params": { 00:07:02.514 "impl_name": "posix", 00:07:02.514 "recv_buf_size": 2097152, 00:07:02.514 "send_buf_size": 2097152, 00:07:02.514 "enable_recv_pipe": true, 00:07:02.514 "enable_quickack": false, 00:07:02.514 "enable_placement_id": 0, 00:07:02.514 "enable_zerocopy_send_server": true, 00:07:02.514 "enable_zerocopy_send_client": false, 00:07:02.514 "zerocopy_threshold": 0, 00:07:02.514 "tls_version": 0, 00:07:02.514 "enable_ktls": false 00:07:02.514 } 00:07:02.514 } 00:07:02.514 ] 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "subsystem": "iobuf", 00:07:02.514 "config": [ 00:07:02.514 { 00:07:02.514 "method": "iobuf_set_options", 00:07:02.514 "params": { 00:07:02.514 "small_pool_count": 8192, 00:07:02.514 "large_pool_count": 1024, 00:07:02.514 "small_bufsize": 8192, 00:07:02.514 "large_bufsize": 135168, 00:07:02.514 "enable_numa": false 00:07:02.514 } 00:07:02.514 } 00:07:02.514 ] 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "subsystem": "keyring", 00:07:02.514 "config": [] 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "subsystem": "vfio_user_target", 00:07:02.514 "config": null 00:07:02.514 }, 00:07:02.514 { 00:07:02.514 "subsystem": "fsdev", 00:07:02.514 "config": [ 00:07:02.514 { 00:07:02.514 "method": "fsdev_set_opts", 00:07:02.514 "params": { 00:07:02.515 "fsdev_io_pool_size": 65535, 00:07:02.515 "fsdev_io_cache_size": 256 00:07:02.515 } 00:07:02.515 } 00:07:02.515 ] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "accel", 00:07:02.515 "config": [ 00:07:02.515 { 00:07:02.515 "method": "accel_set_options", 00:07:02.515 "params": { 00:07:02.515 "small_cache_size": 128, 00:07:02.515 "large_cache_size": 16, 00:07:02.515 "task_count": 2048, 00:07:02.515 "sequence_count": 2048, 00:07:02.515 "buf_count": 2048 00:07:02.515 } 00:07:02.515 } 00:07:02.515 ] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "bdev", 00:07:02.515 "config": [ 00:07:02.515 { 00:07:02.515 "method": "bdev_set_options", 00:07:02.515 "params": { 00:07:02.515 "bdev_io_pool_size": 65535, 00:07:02.515 "bdev_io_cache_size": 256, 00:07:02.515 "bdev_auto_examine": true, 00:07:02.515 "iobuf_small_cache_size": 128, 00:07:02.515 "iobuf_large_cache_size": 16 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "bdev_raid_set_options", 00:07:02.515 "params": { 00:07:02.515 "process_window_size_kb": 1024, 00:07:02.515 "process_max_bandwidth_mb_sec": 0 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "bdev_nvme_set_options", 00:07:02.515 "params": { 00:07:02.515 "action_on_timeout": "none", 00:07:02.515 "timeout_us": 0, 00:07:02.515 "timeout_admin_us": 0, 00:07:02.515 "keep_alive_timeout_ms": 10000, 00:07:02.515 "arbitration_burst": 0, 00:07:02.515 "low_priority_weight": 0, 00:07:02.515 "medium_priority_weight": 0, 00:07:02.515 "high_priority_weight": 0, 00:07:02.515 "nvme_adminq_poll_period_us": 10000, 00:07:02.515 "nvme_ioq_poll_period_us": 0, 00:07:02.515 "io_queue_requests": 0, 00:07:02.515 "delay_cmd_submit": true, 00:07:02.515 "transport_retry_count": 4, 00:07:02.515 "bdev_retry_count": 3, 00:07:02.515 "transport_ack_timeout": 0, 00:07:02.515 "ctrlr_loss_timeout_sec": 0, 00:07:02.515 "reconnect_delay_sec": 0, 00:07:02.515 "fast_io_fail_timeout_sec": 0, 00:07:02.515 
"disable_auto_failback": false, 00:07:02.515 "generate_uuids": false, 00:07:02.515 "transport_tos": 0, 00:07:02.515 "nvme_error_stat": false, 00:07:02.515 "rdma_srq_size": 0, 00:07:02.515 "io_path_stat": false, 00:07:02.515 "allow_accel_sequence": false, 00:07:02.515 "rdma_max_cq_size": 0, 00:07:02.515 "rdma_cm_event_timeout_ms": 0, 00:07:02.515 "dhchap_digests": [ 00:07:02.515 "sha256", 00:07:02.515 "sha384", 00:07:02.515 "sha512" 00:07:02.515 ], 00:07:02.515 "dhchap_dhgroups": [ 00:07:02.515 "null", 00:07:02.515 "ffdhe2048", 00:07:02.515 "ffdhe3072", 00:07:02.515 "ffdhe4096", 00:07:02.515 "ffdhe6144", 00:07:02.515 "ffdhe8192" 00:07:02.515 ] 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "bdev_nvme_set_hotplug", 00:07:02.515 "params": { 00:07:02.515 "period_us": 100000, 00:07:02.515 "enable": false 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "bdev_iscsi_set_options", 00:07:02.515 "params": { 00:07:02.515 "timeout_sec": 30 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "bdev_wait_for_examine" 00:07:02.515 } 00:07:02.515 ] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "nvmf", 00:07:02.515 "config": [ 00:07:02.515 { 00:07:02.515 "method": "nvmf_set_config", 00:07:02.515 "params": { 00:07:02.515 "discovery_filter": "match_any", 00:07:02.515 "admin_cmd_passthru": { 00:07:02.515 "identify_ctrlr": false 00:07:02.515 }, 00:07:02.515 "dhchap_digests": [ 00:07:02.515 "sha256", 00:07:02.515 "sha384", 00:07:02.515 "sha512" 00:07:02.515 ], 00:07:02.515 "dhchap_dhgroups": [ 00:07:02.515 "null", 00:07:02.515 "ffdhe2048", 00:07:02.515 "ffdhe3072", 00:07:02.515 "ffdhe4096", 00:07:02.515 "ffdhe6144", 00:07:02.515 "ffdhe8192" 00:07:02.515 ] 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "nvmf_set_max_subsystems", 00:07:02.515 "params": { 00:07:02.515 "max_subsystems": 1024 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "nvmf_set_crdt", 00:07:02.515 "params": { 00:07:02.515 "crdt1": 0, 00:07:02.515 "crdt2": 0, 00:07:02.515 "crdt3": 0 00:07:02.515 } 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "method": "nvmf_create_transport", 00:07:02.515 "params": { 00:07:02.515 "trtype": "TCP", 00:07:02.515 "max_queue_depth": 128, 00:07:02.515 "max_io_qpairs_per_ctrlr": 127, 00:07:02.515 "in_capsule_data_size": 4096, 00:07:02.515 "max_io_size": 131072, 00:07:02.515 "io_unit_size": 131072, 00:07:02.515 "max_aq_depth": 128, 00:07:02.515 "num_shared_buffers": 511, 00:07:02.515 "buf_cache_size": 4294967295, 00:07:02.515 "dif_insert_or_strip": false, 00:07:02.515 "zcopy": false, 00:07:02.515 "c2h_success": true, 00:07:02.515 "sock_priority": 0, 00:07:02.515 "abort_timeout_sec": 1, 00:07:02.515 "ack_timeout": 0, 00:07:02.515 "data_wr_pool_size": 0 00:07:02.515 } 00:07:02.515 } 00:07:02.515 ] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "nbd", 00:07:02.515 "config": [] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "ublk", 00:07:02.515 "config": [] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "vhost_blk", 00:07:02.515 "config": [] 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "scsi", 00:07:02.515 "config": null 00:07:02.515 }, 00:07:02.515 { 00:07:02.515 "subsystem": "iscsi", 00:07:02.515 "config": [ 00:07:02.515 { 00:07:02.515 "method": "iscsi_set_options", 00:07:02.515 "params": { 00:07:02.515 "node_base": "iqn.2016-06.io.spdk", 00:07:02.515 "max_sessions": 128, 00:07:02.515 "max_connections_per_session": 2, 00:07:02.515 "max_queue_depth": 64, 00:07:02.516 
"default_time2wait": 2, 00:07:02.516 "default_time2retain": 20, 00:07:02.516 "first_burst_length": 8192, 00:07:02.516 "immediate_data": true, 00:07:02.516 "allow_duplicated_isid": false, 00:07:02.516 "error_recovery_level": 0, 00:07:02.516 "nop_timeout": 60, 00:07:02.516 "nop_in_interval": 30, 00:07:02.516 "disable_chap": false, 00:07:02.516 "require_chap": false, 00:07:02.516 "mutual_chap": false, 00:07:02.516 "chap_group": 0, 00:07:02.516 "max_large_datain_per_connection": 64, 00:07:02.516 "max_r2t_per_connection": 4, 00:07:02.516 "pdu_pool_size": 36864, 00:07:02.516 "immediate_data_pool_size": 16384, 00:07:02.516 "data_out_pool_size": 2048 00:07:02.516 } 00:07:02.516 } 00:07:02.516 ] 00:07:02.516 }, 00:07:02.516 { 00:07:02.516 "subsystem": "vhost_scsi", 00:07:02.516 "config": [] 00:07:02.516 } 00:07:02.516 ] 00:07:02.516 } 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 136380 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 136380 ']' 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 136380 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 136380 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 136380' 00:07:02.516 killing process with pid 136380 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 136380 00:07:02.516 04:23:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 136380 00:07:02.776 04:23:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=136405 00:07:02.776 04:23:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:02.776 04:23:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 136405 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 136405 ']' 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 136405 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 136405 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 
'killing process with pid 136405' 00:07:08.059 killing process with pid 136405 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 136405 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 136405 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:08.059 00:07:08.059 real 0m6.250s 00:07:08.059 user 0m5.922s 00:07:08.059 sys 0m0.681s 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:08.059 ************************************ 00:07:08.059 END TEST skip_rpc_with_json 00:07:08.059 ************************************ 00:07:08.059 04:23:46 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:08.059 04:23:46 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:08.059 04:23:46 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:08.059 04:23:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:08.059 ************************************ 00:07:08.059 START TEST skip_rpc_with_delay 00:07:08.059 ************************************ 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:08.059 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 
00:07:08.319 [2024-11-17 04:23:46.904664] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:07:08.319 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:07:08.319 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:08.319 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:08.319 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:08.319 00:07:08.319 real 0m0.046s 00:07:08.319 user 0m0.023s 00:07:08.319 sys 0m0.023s 00:07:08.319 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:08.319 04:23:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:08.319 ************************************ 00:07:08.319 END TEST skip_rpc_with_delay 00:07:08.319 ************************************ 00:07:08.319 04:23:46 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:08.319 04:23:46 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:08.319 04:23:46 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:08.319 04:23:46 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:08.319 04:23:46 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:08.319 04:23:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:08.319 ************************************ 00:07:08.319 START TEST exit_on_failed_rpc_init 00:07:08.319 ************************************ 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=137519 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 137519 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 137519 ']' 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:08.319 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:08.319 [2024-11-17 04:23:47.040732] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:08.319 [2024-11-17 04:23:47.040797] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid137519 ] 00:07:08.319 [2024-11-17 04:23:47.126313] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.579 [2024-11-17 04:23:47.149222] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:08.579 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:08.580 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:08.580 [2024-11-17 04:23:47.387786] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:08.580 [2024-11-17 04:23:47.387860] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid137531 ] 00:07:08.840 [2024-11-17 04:23:47.470300] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.840 [2024-11-17 04:23:47.492286] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.840 [2024-11-17 04:23:47.492379] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:08.840 [2024-11-17 04:23:47.492392] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:08.840 [2024-11-17 04:23:47.492401] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 137519 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 137519 ']' 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 137519 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 137519 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 137519' 00:07:08.840 killing process with pid 137519 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 137519 00:07:08.840 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 137519 00:07:09.101 00:07:09.101 real 0m0.854s 00:07:09.101 user 0m0.830s 00:07:09.101 sys 0m0.438s 00:07:09.101 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:09.101 04:23:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:09.101 ************************************ 00:07:09.101 END TEST exit_on_failed_rpc_init 00:07:09.101 ************************************ 00:07:09.101 04:23:47 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:09.101 00:07:09.101 real 0m13.061s 00:07:09.101 user 0m12.123s 00:07:09.101 sys 0m1.798s 00:07:09.101 04:23:47 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:09.101 04:23:47 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:09.101 ************************************ 00:07:09.101 END TEST skip_rpc 00:07:09.101 ************************************ 00:07:09.361 04:23:47 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:09.361 04:23:47 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:09.361 04:23:47 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:09.361 04:23:47 -- 
common/autotest_common.sh@10 -- # set +x 00:07:09.361 ************************************ 00:07:09.361 START TEST rpc_client 00:07:09.361 ************************************ 00:07:09.361 04:23:47 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:09.361 * Looking for test storage... 00:07:09.361 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:09.361 04:23:48 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:09.361 04:23:48 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:07:09.361 04:23:48 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:09.361 04:23:48 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@345 -- # : 1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@353 -- # local d=1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@355 -- # echo 1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@353 -- # local d=2 00:07:09.361 04:23:48 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:09.621 04:23:48 rpc_client -- scripts/common.sh@355 -- # echo 2 00:07:09.621 04:23:48 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:07:09.621 04:23:48 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:09.621 04:23:48 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:09.621 04:23:48 rpc_client -- scripts/common.sh@368 -- # return 0 00:07:09.621 04:23:48 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:09.621 04:23:48 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:09.621 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.621 --rc genhtml_branch_coverage=1 00:07:09.621 --rc genhtml_function_coverage=1 00:07:09.621 --rc genhtml_legend=1 00:07:09.621 --rc geninfo_all_blocks=1 00:07:09.621 --rc geninfo_unexecuted_blocks=1 00:07:09.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.622 ' 00:07:09.622 04:23:48 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:09.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.622 --rc genhtml_branch_coverage=1 00:07:09.622 --rc genhtml_function_coverage=1 00:07:09.622 --rc genhtml_legend=1 00:07:09.622 --rc geninfo_all_blocks=1 00:07:09.622 --rc geninfo_unexecuted_blocks=1 00:07:09.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.622 ' 00:07:09.622 04:23:48 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:09.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.622 --rc genhtml_branch_coverage=1 00:07:09.622 --rc genhtml_function_coverage=1 00:07:09.622 --rc genhtml_legend=1 00:07:09.622 --rc geninfo_all_blocks=1 00:07:09.622 --rc geninfo_unexecuted_blocks=1 00:07:09.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.622 ' 00:07:09.622 04:23:48 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:09.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.622 --rc genhtml_branch_coverage=1 00:07:09.622 --rc genhtml_function_coverage=1 00:07:09.622 --rc genhtml_legend=1 00:07:09.622 --rc geninfo_all_blocks=1 00:07:09.622 --rc geninfo_unexecuted_blocks=1 00:07:09.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.622 ' 00:07:09.622 04:23:48 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:09.622 OK 00:07:09.622 04:23:48 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:09.622 00:07:09.622 real 0m0.218s 00:07:09.622 user 0m0.103s 00:07:09.622 sys 0m0.132s 00:07:09.622 04:23:48 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
00:07:09.622 04:23:48 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:09.622 ************************************ 00:07:09.622 END TEST rpc_client 00:07:09.622 ************************************ 00:07:09.622 04:23:48 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:09.622 04:23:48 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:09.622 04:23:48 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:09.622 04:23:48 -- common/autotest_common.sh@10 -- # set +x 00:07:09.622 ************************************ 00:07:09.622 START TEST json_config 00:07:09.622 ************************************ 00:07:09.622 04:23:48 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:09.622 04:23:48 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:09.622 04:23:48 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:07:09.622 04:23:48 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:09.622 04:23:48 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:09.622 04:23:48 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:09.622 04:23:48 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:09.882 04:23:48 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:09.882 04:23:48 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:07:09.882 04:23:48 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:07:09.882 04:23:48 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:07:09.882 04:23:48 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:07:09.882 04:23:48 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:07:09.882 04:23:48 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:09.882 04:23:48 json_config -- scripts/common.sh@344 -- # case "$op" in 00:07:09.882 04:23:48 json_config -- scripts/common.sh@345 -- # : 1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:09.882 04:23:48 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:09.882 04:23:48 json_config -- scripts/common.sh@365 -- # decimal 1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@353 -- # local d=1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:09.882 04:23:48 json_config -- scripts/common.sh@355 -- # echo 1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:07:09.882 04:23:48 json_config -- scripts/common.sh@366 -- # decimal 2 00:07:09.882 04:23:48 json_config -- scripts/common.sh@353 -- # local d=2 00:07:09.882 04:23:48 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:09.882 04:23:48 json_config -- scripts/common.sh@355 -- # echo 2 00:07:09.882 04:23:48 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:07:09.882 04:23:48 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:09.882 04:23:48 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:09.882 04:23:48 json_config -- scripts/common.sh@368 -- # return 0 00:07:09.882 04:23:48 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:09.882 04:23:48 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:09.882 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.882 --rc genhtml_branch_coverage=1 00:07:09.882 --rc genhtml_function_coverage=1 00:07:09.882 --rc genhtml_legend=1 00:07:09.882 --rc geninfo_all_blocks=1 00:07:09.882 --rc geninfo_unexecuted_blocks=1 00:07:09.882 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.882 ' 00:07:09.882 04:23:48 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:09.882 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.882 --rc genhtml_branch_coverage=1 00:07:09.882 --rc genhtml_function_coverage=1 00:07:09.882 --rc genhtml_legend=1 00:07:09.882 --rc geninfo_all_blocks=1 00:07:09.882 --rc geninfo_unexecuted_blocks=1 00:07:09.882 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.882 ' 00:07:09.882 04:23:48 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:09.883 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.883 --rc genhtml_branch_coverage=1 00:07:09.883 --rc genhtml_function_coverage=1 00:07:09.883 --rc genhtml_legend=1 00:07:09.883 --rc geninfo_all_blocks=1 00:07:09.883 --rc geninfo_unexecuted_blocks=1 00:07:09.883 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.883 ' 00:07:09.883 04:23:48 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:09.883 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:09.883 --rc genhtml_branch_coverage=1 00:07:09.883 --rc genhtml_function_coverage=1 00:07:09.883 --rc genhtml_legend=1 00:07:09.883 --rc geninfo_all_blocks=1 00:07:09.883 --rc geninfo_unexecuted_blocks=1 00:07:09.883 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:09.883 ' 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:09.883 04:23:48 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:07:09.883 04:23:48 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:09.883 04:23:48 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:09.883 04:23:48 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:09.883 04:23:48 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:09.883 04:23:48 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:09.883 04:23:48 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:09.883 04:23:48 json_config -- paths/export.sh@5 -- # export PATH 00:07:09.883 04:23:48 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@51 -- # : 0 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:09.883 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:09.883 04:23:48 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:09.883 WARNING: No tests are enabled so not running JSON configuration tests 00:07:09.883 04:23:48 json_config -- json_config/json_config.sh@28 -- # exit 0 00:07:09.883 00:07:09.883 real 0m0.202s 00:07:09.883 user 0m0.120s 00:07:09.883 sys 0m0.091s 00:07:09.883 04:23:48 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:09.883 04:23:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:09.883 ************************************ 00:07:09.883 END TEST json_config 00:07:09.883 ************************************ 00:07:09.883 04:23:48 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:09.883 04:23:48 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:09.883 04:23:48 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:09.883 04:23:48 -- common/autotest_common.sh@10 -- # set +x 00:07:09.883 ************************************ 00:07:09.883 START TEST json_config_extra_key 00:07:09.883 ************************************ 00:07:09.883 04:23:48 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:09.883 04:23:48 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:09.883 04:23:48 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:07:09.883 04:23:48 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.144 04:23:48 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:07:10.144 04:23:48 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.145 --rc genhtml_branch_coverage=1 00:07:10.145 --rc genhtml_function_coverage=1 00:07:10.145 --rc genhtml_legend=1 00:07:10.145 --rc geninfo_all_blocks=1 00:07:10.145 --rc geninfo_unexecuted_blocks=1 00:07:10.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.145 ' 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.145 --rc genhtml_branch_coverage=1 
00:07:10.145 --rc genhtml_function_coverage=1 00:07:10.145 --rc genhtml_legend=1 00:07:10.145 --rc geninfo_all_blocks=1 00:07:10.145 --rc geninfo_unexecuted_blocks=1 00:07:10.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.145 ' 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.145 --rc genhtml_branch_coverage=1 00:07:10.145 --rc genhtml_function_coverage=1 00:07:10.145 --rc genhtml_legend=1 00:07:10.145 --rc geninfo_all_blocks=1 00:07:10.145 --rc geninfo_unexecuted_blocks=1 00:07:10.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.145 ' 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.145 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.145 --rc genhtml_branch_coverage=1 00:07:10.145 --rc genhtml_function_coverage=1 00:07:10.145 --rc genhtml_legend=1 00:07:10.145 --rc geninfo_all_blocks=1 00:07:10.145 --rc geninfo_unexecuted_blocks=1 00:07:10.145 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.145 ' 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:10.145 04:23:48 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:10.145 04:23:48 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:10.145 04:23:48 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.145 04:23:48 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.145 04:23:48 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.145 04:23:48 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:10.145 04:23:48 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:10.145 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:10.145 04:23:48 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:10.145 INFO: launching applications... 00:07:10.145 04:23:48 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=137961 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:10.145 Waiting for target to run... 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 137961 /var/tmp/spdk_tgt.sock 00:07:10.145 04:23:48 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 137961 ']' 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:10.145 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
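The trace above captures the launch pattern json_config_extra_key relies on: spdk_tgt is started with an explicit RPC socket (-r) and a JSON config (--json), and the harness then waits for that socket to answer before driving the rest of the test. A minimal stand-alone sketch of the same launch-and-wait loop, assuming the stock spdk_tgt and rpc.py from this tree and a run from the repository root; the config path, poll interval, and timeout below are illustrative choices, not values taken from the harness:

  #!/usr/bin/env bash
  # Start the target with a private RPC socket and a JSON config, then wait for it to come up.
  SOCK=/var/tmp/spdk_tgt.sock
  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r "$SOCK" --json ./test/json_config/extra_key.json &
  pid=$!
  for _ in $(seq 1 60); do
      # rpc_get_methods succeeds only once the RPC server is listening on $SOCK
      if ./scripts/rpc.py -s "$SOCK" rpc_get_methods >/dev/null 2>&1; then
          echo "target is up (pid $pid)"
          exit 0
      fi
      kill -0 "$pid" 2>/dev/null || { echo "target exited during startup" >&2; exit 1; }
      sleep 0.5
  done
  echo "timed out waiting for $SOCK" >&2
  exit 1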
00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:10.145 04:23:48 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:10.145 [2024-11-17 04:23:48.813660] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:10.146 [2024-11-17 04:23:48.813743] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid137961 ] 00:07:10.414 [2024-11-17 04:23:49.110681] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.414 [2024-11-17 04:23:49.123409] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.985 04:23:49 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:10.985 04:23:49 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:10.985 00:07:10.985 04:23:49 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:10.985 INFO: shutting down applications... 00:07:10.985 04:23:49 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 137961 ]] 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 137961 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 137961 00:07:10.985 04:23:49 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 137961 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:11.556 04:23:50 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:11.556 SPDK target shutdown done 00:07:11.556 04:23:50 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:11.556 Success 00:07:11.556 00:07:11.556 real 0m1.597s 00:07:11.556 user 0m1.327s 00:07:11.556 sys 0m0.441s 00:07:11.556 04:23:50 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:11.556 04:23:50 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:11.556 ************************************ 00:07:11.556 END TEST json_config_extra_key 00:07:11.556 ************************************ 00:07:11.556 04:23:50 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
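The json_config_extra_key teardown shown earlier in this trace is the counterpart of that launch loop: the harness sends SIGINT to the saved pid and then polls kill -0, sleeping between attempts, until the process is gone and "SPDK target shutdown done" is printed. A minimal sketch of that graceful-stop loop, assuming the pid captured by the launch step above; the retry count, sleep, and SIGKILL fallback are illustrative and not taken from the harness:

  # Gracefully stop the target started earlier; escalate only if it never exits.
  kill -SIGINT "$pid"
  for _ in $(seq 1 30); do
      kill -0 "$pid" 2>/dev/null || { echo "SPDK target shutdown done"; exit 0; }
      sleep 0.5
  done
  echo "target did not stop, sending SIGKILL" >&2
  kill -9 "$pid"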
00:07:11.556 04:23:50 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:11.556 04:23:50 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:11.556 04:23:50 -- common/autotest_common.sh@10 -- # set +x 00:07:11.556 ************************************ 00:07:11.556 START TEST alias_rpc 00:07:11.556 ************************************ 00:07:11.556 04:23:50 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:11.556 * Looking for test storage... 00:07:11.556 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:11.556 04:23:50 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:11.556 04:23:50 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:07:11.556 04:23:50 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@345 -- # : 1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:11.817 04:23:50 alias_rpc -- scripts/common.sh@368 -- # return 0 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:11.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.817 --rc genhtml_branch_coverage=1 00:07:11.817 --rc genhtml_function_coverage=1 00:07:11.817 --rc genhtml_legend=1 00:07:11.817 --rc geninfo_all_blocks=1 00:07:11.817 --rc geninfo_unexecuted_blocks=1 00:07:11.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.817 ' 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:11.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.817 --rc genhtml_branch_coverage=1 00:07:11.817 --rc genhtml_function_coverage=1 00:07:11.817 --rc genhtml_legend=1 00:07:11.817 --rc geninfo_all_blocks=1 00:07:11.817 --rc geninfo_unexecuted_blocks=1 00:07:11.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.817 ' 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:11.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.817 --rc genhtml_branch_coverage=1 00:07:11.817 --rc genhtml_function_coverage=1 00:07:11.817 --rc genhtml_legend=1 00:07:11.817 --rc geninfo_all_blocks=1 00:07:11.817 --rc geninfo_unexecuted_blocks=1 00:07:11.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.817 ' 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:11.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.817 --rc genhtml_branch_coverage=1 00:07:11.817 --rc genhtml_function_coverage=1 00:07:11.817 --rc genhtml_legend=1 00:07:11.817 --rc geninfo_all_blocks=1 00:07:11.817 --rc geninfo_unexecuted_blocks=1 00:07:11.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.817 ' 00:07:11.817 04:23:50 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:11.817 04:23:50 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:11.817 04:23:50 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=138294 00:07:11.817 04:23:50 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 138294 00:07:11.817 04:23:50 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 138294 ']' 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:11.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:11.817 04:23:50 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:11.817 [2024-11-17 04:23:50.483962] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:11.817 [2024-11-17 04:23:50.484028] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138294 ] 00:07:11.817 [2024-11-17 04:23:50.568021] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.817 [2024-11-17 04:23:50.590975] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.077 04:23:50 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:12.077 04:23:50 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:12.077 04:23:50 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:12.338 04:23:51 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 138294 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 138294 ']' 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 138294 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138294 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138294' 00:07:12.338 killing process with pid 138294 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@973 -- # kill 138294 00:07:12.338 04:23:51 alias_rpc -- common/autotest_common.sh@978 -- # wait 138294 00:07:12.597 00:07:12.597 real 0m1.087s 00:07:12.597 user 0m1.094s 00:07:12.597 sys 0m0.453s 00:07:12.597 04:23:51 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:12.597 04:23:51 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:12.597 ************************************ 00:07:12.597 END TEST alias_rpc 00:07:12.597 ************************************ 00:07:12.597 04:23:51 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:07:12.597 04:23:51 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:12.597 04:23:51 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:12.597 04:23:51 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:12.597 04:23:51 -- common/autotest_common.sh@10 -- # set +x 00:07:12.857 ************************************ 00:07:12.857 START TEST spdkcli_tcp 
00:07:12.857 ************************************ 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:12.857 * Looking for test storage... 00:07:12.857 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:12.857 04:23:51 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:12.857 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.857 --rc genhtml_branch_coverage=1 00:07:12.857 --rc genhtml_function_coverage=1 00:07:12.857 --rc genhtml_legend=1 00:07:12.857 --rc geninfo_all_blocks=1 00:07:12.857 --rc geninfo_unexecuted_blocks=1 00:07:12.857 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.857 ' 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:12.857 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.857 --rc genhtml_branch_coverage=1 00:07:12.857 --rc genhtml_function_coverage=1 00:07:12.857 --rc genhtml_legend=1 00:07:12.857 --rc geninfo_all_blocks=1 00:07:12.857 --rc geninfo_unexecuted_blocks=1 00:07:12.857 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.857 ' 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:12.857 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.857 --rc genhtml_branch_coverage=1 00:07:12.857 --rc genhtml_function_coverage=1 00:07:12.857 --rc genhtml_legend=1 00:07:12.857 --rc geninfo_all_blocks=1 00:07:12.857 --rc geninfo_unexecuted_blocks=1 00:07:12.857 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.857 ' 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:12.857 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.857 --rc genhtml_branch_coverage=1 00:07:12.857 --rc genhtml_function_coverage=1 00:07:12.857 --rc genhtml_legend=1 00:07:12.857 --rc geninfo_all_blocks=1 00:07:12.857 --rc geninfo_unexecuted_blocks=1 00:07:12.857 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.857 ' 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=138615 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:12.857 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 138615 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 138615 ']' 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.857 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:12.857 04:23:51 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:12.857 [2024-11-17 04:23:51.666262] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:12.857 [2024-11-17 04:23:51.666321] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138615 ] 00:07:13.117 [2024-11-17 04:23:51.750611] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:13.117 [2024-11-17 04:23:51.774218] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:13.117 [2024-11-17 04:23:51.774219] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.377 04:23:51 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:13.377 04:23:51 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:07:13.377 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=138620 00:07:13.377 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:13.377 04:23:51 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:13.377 [ 00:07:13.377 "spdk_get_version", 00:07:13.377 "rpc_get_methods", 00:07:13.377 "notify_get_notifications", 00:07:13.377 "notify_get_types", 00:07:13.377 "trace_get_info", 00:07:13.377 "trace_get_tpoint_group_mask", 00:07:13.377 "trace_disable_tpoint_group", 00:07:13.377 "trace_enable_tpoint_group", 00:07:13.377 "trace_clear_tpoint_mask", 00:07:13.377 "trace_set_tpoint_mask", 00:07:13.377 "fsdev_set_opts", 00:07:13.377 "fsdev_get_opts", 00:07:13.377 "framework_get_pci_devices", 00:07:13.377 "framework_get_config", 00:07:13.377 "framework_get_subsystems", 00:07:13.378 "vfu_tgt_set_base_path", 00:07:13.378 "keyring_get_keys", 
00:07:13.378 "iobuf_get_stats", 00:07:13.378 "iobuf_set_options", 00:07:13.378 "sock_get_default_impl", 00:07:13.378 "sock_set_default_impl", 00:07:13.378 "sock_impl_set_options", 00:07:13.378 "sock_impl_get_options", 00:07:13.378 "vmd_rescan", 00:07:13.378 "vmd_remove_device", 00:07:13.378 "vmd_enable", 00:07:13.378 "accel_get_stats", 00:07:13.378 "accel_set_options", 00:07:13.378 "accel_set_driver", 00:07:13.378 "accel_crypto_key_destroy", 00:07:13.378 "accel_crypto_keys_get", 00:07:13.378 "accel_crypto_key_create", 00:07:13.378 "accel_assign_opc", 00:07:13.378 "accel_get_module_info", 00:07:13.378 "accel_get_opc_assignments", 00:07:13.378 "bdev_get_histogram", 00:07:13.378 "bdev_enable_histogram", 00:07:13.378 "bdev_set_qos_limit", 00:07:13.378 "bdev_set_qd_sampling_period", 00:07:13.378 "bdev_get_bdevs", 00:07:13.378 "bdev_reset_iostat", 00:07:13.378 "bdev_get_iostat", 00:07:13.378 "bdev_examine", 00:07:13.378 "bdev_wait_for_examine", 00:07:13.378 "bdev_set_options", 00:07:13.378 "scsi_get_devices", 00:07:13.378 "thread_set_cpumask", 00:07:13.378 "scheduler_set_options", 00:07:13.378 "framework_get_governor", 00:07:13.378 "framework_get_scheduler", 00:07:13.378 "framework_set_scheduler", 00:07:13.378 "framework_get_reactors", 00:07:13.378 "thread_get_io_channels", 00:07:13.378 "thread_get_pollers", 00:07:13.378 "thread_get_stats", 00:07:13.378 "framework_monitor_context_switch", 00:07:13.378 "spdk_kill_instance", 00:07:13.378 "log_enable_timestamps", 00:07:13.378 "log_get_flags", 00:07:13.378 "log_clear_flag", 00:07:13.378 "log_set_flag", 00:07:13.378 "log_get_level", 00:07:13.378 "log_set_level", 00:07:13.378 "log_get_print_level", 00:07:13.378 "log_set_print_level", 00:07:13.378 "framework_enable_cpumask_locks", 00:07:13.378 "framework_disable_cpumask_locks", 00:07:13.378 "framework_wait_init", 00:07:13.378 "framework_start_init", 00:07:13.378 "virtio_blk_create_transport", 00:07:13.378 "virtio_blk_get_transports", 00:07:13.378 "vhost_controller_set_coalescing", 00:07:13.378 "vhost_get_controllers", 00:07:13.378 "vhost_delete_controller", 00:07:13.378 "vhost_create_blk_controller", 00:07:13.378 "vhost_scsi_controller_remove_target", 00:07:13.378 "vhost_scsi_controller_add_target", 00:07:13.378 "vhost_start_scsi_controller", 00:07:13.378 "vhost_create_scsi_controller", 00:07:13.378 "ublk_recover_disk", 00:07:13.378 "ublk_get_disks", 00:07:13.378 "ublk_stop_disk", 00:07:13.378 "ublk_start_disk", 00:07:13.378 "ublk_destroy_target", 00:07:13.378 "ublk_create_target", 00:07:13.378 "nbd_get_disks", 00:07:13.378 "nbd_stop_disk", 00:07:13.378 "nbd_start_disk", 00:07:13.378 "env_dpdk_get_mem_stats", 00:07:13.378 "nvmf_stop_mdns_prr", 00:07:13.378 "nvmf_publish_mdns_prr", 00:07:13.378 "nvmf_subsystem_get_listeners", 00:07:13.378 "nvmf_subsystem_get_qpairs", 00:07:13.378 "nvmf_subsystem_get_controllers", 00:07:13.378 "nvmf_get_stats", 00:07:13.378 "nvmf_get_transports", 00:07:13.378 "nvmf_create_transport", 00:07:13.378 "nvmf_get_targets", 00:07:13.378 "nvmf_delete_target", 00:07:13.378 "nvmf_create_target", 00:07:13.378 "nvmf_subsystem_allow_any_host", 00:07:13.378 "nvmf_subsystem_set_keys", 00:07:13.378 "nvmf_subsystem_remove_host", 00:07:13.378 "nvmf_subsystem_add_host", 00:07:13.378 "nvmf_ns_remove_host", 00:07:13.378 "nvmf_ns_add_host", 00:07:13.378 "nvmf_subsystem_remove_ns", 00:07:13.378 "nvmf_subsystem_set_ns_ana_group", 00:07:13.378 "nvmf_subsystem_add_ns", 00:07:13.378 "nvmf_subsystem_listener_set_ana_state", 00:07:13.378 "nvmf_discovery_get_referrals", 00:07:13.378 
"nvmf_discovery_remove_referral", 00:07:13.378 "nvmf_discovery_add_referral", 00:07:13.378 "nvmf_subsystem_remove_listener", 00:07:13.378 "nvmf_subsystem_add_listener", 00:07:13.378 "nvmf_delete_subsystem", 00:07:13.378 "nvmf_create_subsystem", 00:07:13.378 "nvmf_get_subsystems", 00:07:13.378 "nvmf_set_crdt", 00:07:13.378 "nvmf_set_config", 00:07:13.378 "nvmf_set_max_subsystems", 00:07:13.378 "iscsi_get_histogram", 00:07:13.378 "iscsi_enable_histogram", 00:07:13.378 "iscsi_set_options", 00:07:13.378 "iscsi_get_auth_groups", 00:07:13.378 "iscsi_auth_group_remove_secret", 00:07:13.378 "iscsi_auth_group_add_secret", 00:07:13.378 "iscsi_delete_auth_group", 00:07:13.378 "iscsi_create_auth_group", 00:07:13.378 "iscsi_set_discovery_auth", 00:07:13.378 "iscsi_get_options", 00:07:13.378 "iscsi_target_node_request_logout", 00:07:13.378 "iscsi_target_node_set_redirect", 00:07:13.378 "iscsi_target_node_set_auth", 00:07:13.378 "iscsi_target_node_add_lun", 00:07:13.378 "iscsi_get_stats", 00:07:13.378 "iscsi_get_connections", 00:07:13.378 "iscsi_portal_group_set_auth", 00:07:13.378 "iscsi_start_portal_group", 00:07:13.378 "iscsi_delete_portal_group", 00:07:13.378 "iscsi_create_portal_group", 00:07:13.378 "iscsi_get_portal_groups", 00:07:13.378 "iscsi_delete_target_node", 00:07:13.378 "iscsi_target_node_remove_pg_ig_maps", 00:07:13.378 "iscsi_target_node_add_pg_ig_maps", 00:07:13.378 "iscsi_create_target_node", 00:07:13.378 "iscsi_get_target_nodes", 00:07:13.378 "iscsi_delete_initiator_group", 00:07:13.378 "iscsi_initiator_group_remove_initiators", 00:07:13.378 "iscsi_initiator_group_add_initiators", 00:07:13.378 "iscsi_create_initiator_group", 00:07:13.378 "iscsi_get_initiator_groups", 00:07:13.378 "fsdev_aio_delete", 00:07:13.378 "fsdev_aio_create", 00:07:13.378 "keyring_linux_set_options", 00:07:13.378 "keyring_file_remove_key", 00:07:13.378 "keyring_file_add_key", 00:07:13.378 "vfu_virtio_create_fs_endpoint", 00:07:13.378 "vfu_virtio_create_scsi_endpoint", 00:07:13.378 "vfu_virtio_scsi_remove_target", 00:07:13.378 "vfu_virtio_scsi_add_target", 00:07:13.378 "vfu_virtio_create_blk_endpoint", 00:07:13.378 "vfu_virtio_delete_endpoint", 00:07:13.378 "iaa_scan_accel_module", 00:07:13.378 "dsa_scan_accel_module", 00:07:13.378 "ioat_scan_accel_module", 00:07:13.378 "accel_error_inject_error", 00:07:13.378 "bdev_iscsi_delete", 00:07:13.378 "bdev_iscsi_create", 00:07:13.378 "bdev_iscsi_set_options", 00:07:13.378 "bdev_virtio_attach_controller", 00:07:13.378 "bdev_virtio_scsi_get_devices", 00:07:13.378 "bdev_virtio_detach_controller", 00:07:13.378 "bdev_virtio_blk_set_hotplug", 00:07:13.378 "bdev_ftl_set_property", 00:07:13.378 "bdev_ftl_get_properties", 00:07:13.378 "bdev_ftl_get_stats", 00:07:13.378 "bdev_ftl_unmap", 00:07:13.378 "bdev_ftl_unload", 00:07:13.378 "bdev_ftl_delete", 00:07:13.378 "bdev_ftl_load", 00:07:13.378 "bdev_ftl_create", 00:07:13.378 "bdev_aio_delete", 00:07:13.378 "bdev_aio_rescan", 00:07:13.378 "bdev_aio_create", 00:07:13.378 "blobfs_create", 00:07:13.378 "blobfs_detect", 00:07:13.378 "blobfs_set_cache_size", 00:07:13.378 "bdev_zone_block_delete", 00:07:13.378 "bdev_zone_block_create", 00:07:13.378 "bdev_delay_delete", 00:07:13.378 "bdev_delay_create", 00:07:13.378 "bdev_delay_update_latency", 00:07:13.378 "bdev_split_delete", 00:07:13.378 "bdev_split_create", 00:07:13.378 "bdev_error_inject_error", 00:07:13.378 "bdev_error_delete", 00:07:13.378 "bdev_error_create", 00:07:13.378 "bdev_raid_set_options", 00:07:13.378 "bdev_raid_remove_base_bdev", 00:07:13.378 "bdev_raid_add_base_bdev", 
00:07:13.378 "bdev_raid_delete", 00:07:13.378 "bdev_raid_create", 00:07:13.378 "bdev_raid_get_bdevs", 00:07:13.378 "bdev_lvol_set_parent_bdev", 00:07:13.378 "bdev_lvol_set_parent", 00:07:13.378 "bdev_lvol_check_shallow_copy", 00:07:13.378 "bdev_lvol_start_shallow_copy", 00:07:13.378 "bdev_lvol_grow_lvstore", 00:07:13.378 "bdev_lvol_get_lvols", 00:07:13.378 "bdev_lvol_get_lvstores", 00:07:13.378 "bdev_lvol_delete", 00:07:13.378 "bdev_lvol_set_read_only", 00:07:13.378 "bdev_lvol_resize", 00:07:13.378 "bdev_lvol_decouple_parent", 00:07:13.378 "bdev_lvol_inflate", 00:07:13.378 "bdev_lvol_rename", 00:07:13.378 "bdev_lvol_clone_bdev", 00:07:13.378 "bdev_lvol_clone", 00:07:13.378 "bdev_lvol_snapshot", 00:07:13.378 "bdev_lvol_create", 00:07:13.378 "bdev_lvol_delete_lvstore", 00:07:13.378 "bdev_lvol_rename_lvstore", 00:07:13.378 "bdev_lvol_create_lvstore", 00:07:13.378 "bdev_passthru_delete", 00:07:13.378 "bdev_passthru_create", 00:07:13.378 "bdev_nvme_cuse_unregister", 00:07:13.378 "bdev_nvme_cuse_register", 00:07:13.378 "bdev_opal_new_user", 00:07:13.378 "bdev_opal_set_lock_state", 00:07:13.378 "bdev_opal_delete", 00:07:13.378 "bdev_opal_get_info", 00:07:13.379 "bdev_opal_create", 00:07:13.379 "bdev_nvme_opal_revert", 00:07:13.379 "bdev_nvme_opal_init", 00:07:13.379 "bdev_nvme_send_cmd", 00:07:13.379 "bdev_nvme_set_keys", 00:07:13.379 "bdev_nvme_get_path_iostat", 00:07:13.379 "bdev_nvme_get_mdns_discovery_info", 00:07:13.379 "bdev_nvme_stop_mdns_discovery", 00:07:13.379 "bdev_nvme_start_mdns_discovery", 00:07:13.379 "bdev_nvme_set_multipath_policy", 00:07:13.379 "bdev_nvme_set_preferred_path", 00:07:13.379 "bdev_nvme_get_io_paths", 00:07:13.379 "bdev_nvme_remove_error_injection", 00:07:13.379 "bdev_nvme_add_error_injection", 00:07:13.379 "bdev_nvme_get_discovery_info", 00:07:13.379 "bdev_nvme_stop_discovery", 00:07:13.379 "bdev_nvme_start_discovery", 00:07:13.379 "bdev_nvme_get_controller_health_info", 00:07:13.379 "bdev_nvme_disable_controller", 00:07:13.379 "bdev_nvme_enable_controller", 00:07:13.379 "bdev_nvme_reset_controller", 00:07:13.379 "bdev_nvme_get_transport_statistics", 00:07:13.379 "bdev_nvme_apply_firmware", 00:07:13.379 "bdev_nvme_detach_controller", 00:07:13.379 "bdev_nvme_get_controllers", 00:07:13.379 "bdev_nvme_attach_controller", 00:07:13.379 "bdev_nvme_set_hotplug", 00:07:13.379 "bdev_nvme_set_options", 00:07:13.379 "bdev_null_resize", 00:07:13.379 "bdev_null_delete", 00:07:13.379 "bdev_null_create", 00:07:13.379 "bdev_malloc_delete", 00:07:13.379 "bdev_malloc_create" 00:07:13.379 ] 00:07:13.379 04:23:52 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:13.379 04:23:52 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:13.379 04:23:52 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 138615 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 138615 ']' 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 138615 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:13.379 04:23:52 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138615 00:07:13.639 04:23:52 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:13.639 04:23:52 spdkcli_tcp -- 
common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:13.639 04:23:52 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138615' 00:07:13.639 killing process with pid 138615 00:07:13.639 04:23:52 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 138615 00:07:13.639 04:23:52 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 138615 00:07:13.899 00:07:13.899 real 0m1.104s 00:07:13.899 user 0m1.797s 00:07:13.899 sys 0m0.533s 00:07:13.899 04:23:52 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:13.899 04:23:52 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:13.899 ************************************ 00:07:13.899 END TEST spdkcli_tcp 00:07:13.899 ************************************ 00:07:13.899 04:23:52 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:13.899 04:23:52 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:13.899 04:23:52 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:13.899 04:23:52 -- common/autotest_common.sh@10 -- # set +x 00:07:13.899 ************************************ 00:07:13.899 START TEST dpdk_mem_utility 00:07:13.899 ************************************ 00:07:13.899 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:13.899 * Looking for test storage... 00:07:14.159 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:14.159 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:14.159 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:07:14.159 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:14.159 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:14.159 04:23:52 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:07:14.159 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:14.159 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:14.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.160 --rc genhtml_branch_coverage=1 00:07:14.160 --rc genhtml_function_coverage=1 00:07:14.160 --rc genhtml_legend=1 00:07:14.160 --rc geninfo_all_blocks=1 00:07:14.160 --rc geninfo_unexecuted_blocks=1 00:07:14.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.160 ' 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:14.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.160 --rc genhtml_branch_coverage=1 00:07:14.160 --rc genhtml_function_coverage=1 00:07:14.160 --rc genhtml_legend=1 00:07:14.160 --rc geninfo_all_blocks=1 00:07:14.160 --rc geninfo_unexecuted_blocks=1 00:07:14.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.160 ' 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:14.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.160 --rc genhtml_branch_coverage=1 00:07:14.160 --rc genhtml_function_coverage=1 00:07:14.160 --rc genhtml_legend=1 00:07:14.160 --rc geninfo_all_blocks=1 00:07:14.160 --rc geninfo_unexecuted_blocks=1 00:07:14.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.160 ' 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:14.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.160 --rc genhtml_branch_coverage=1 00:07:14.160 --rc genhtml_function_coverage=1 00:07:14.160 --rc genhtml_legend=1 00:07:14.160 --rc geninfo_all_blocks=1 00:07:14.160 --rc geninfo_unexecuted_blocks=1 00:07:14.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.160 ' 00:07:14.160 04:23:52 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:14.160 04:23:52 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=138904 00:07:14.160 04:23:52 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:14.160 04:23:52 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 138904 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 138904 ']' 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:14.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:14.160 04:23:52 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:14.160 [2024-11-17 04:23:52.845671] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:14.160 [2024-11-17 04:23:52.845763] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138904 ] 00:07:14.160 [2024-11-17 04:23:52.913410] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.160 [2024-11-17 04:23:52.935412] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.420 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:14.420 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:07:14.420 04:23:53 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:14.420 04:23:53 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:14.420 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:14.420 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:14.420 { 00:07:14.421 "filename": "/tmp/spdk_mem_dump.txt" 00:07:14.421 } 00:07:14.421 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:14.421 04:23:53 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:14.421 DPDK memory size 810.000000 MiB in 1 heap(s) 00:07:14.421 1 heaps totaling size 810.000000 MiB 00:07:14.421 size: 810.000000 MiB heap id: 0 00:07:14.421 end heaps---------- 00:07:14.421 9 mempools totaling size 595.772034 MiB 00:07:14.421 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:14.421 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:14.421 size: 92.545471 MiB name: bdev_io_138904 00:07:14.421 size: 50.003479 MiB name: msgpool_138904 00:07:14.421 size: 36.509338 MiB name: fsdev_io_138904 00:07:14.421 size: 21.763794 MiB name: PDU_Pool 00:07:14.421 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:14.421 size: 4.133484 MiB name: evtpool_138904 00:07:14.421 size: 0.026123 MiB name: Session_Pool 00:07:14.421 end mempools------- 00:07:14.421 6 memzones totaling size 4.142822 MiB 00:07:14.421 size: 1.000366 MiB name: RG_ring_0_138904 00:07:14.421 size: 1.000366 MiB name: RG_ring_1_138904 00:07:14.421 size: 1.000366 MiB name: RG_ring_4_138904 
00:07:14.421 size: 1.000366 MiB name: RG_ring_5_138904 00:07:14.421 size: 0.125366 MiB name: RG_ring_2_138904 00:07:14.421 size: 0.015991 MiB name: RG_ring_3_138904 00:07:14.421 end memzones------- 00:07:14.421 04:23:53 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:14.421 heap id: 0 total size: 810.000000 MiB number of busy elements: 44 number of free elements: 15 00:07:14.421 list of free elements. size: 10.862488 MiB 00:07:14.421 element at address: 0x200018a00000 with size: 0.999878 MiB 00:07:14.421 element at address: 0x200018c00000 with size: 0.999878 MiB 00:07:14.421 element at address: 0x200000400000 with size: 0.998535 MiB 00:07:14.421 element at address: 0x200031800000 with size: 0.994446 MiB 00:07:14.421 element at address: 0x200008000000 with size: 0.959839 MiB 00:07:14.421 element at address: 0x200012c00000 with size: 0.954285 MiB 00:07:14.421 element at address: 0x200018e00000 with size: 0.936584 MiB 00:07:14.421 element at address: 0x200000200000 with size: 0.717346 MiB 00:07:14.421 element at address: 0x20001a600000 with size: 0.582886 MiB 00:07:14.421 element at address: 0x200000c00000 with size: 0.495422 MiB 00:07:14.421 element at address: 0x200003e00000 with size: 0.490723 MiB 00:07:14.421 element at address: 0x200019000000 with size: 0.485657 MiB 00:07:14.421 element at address: 0x200010600000 with size: 0.481934 MiB 00:07:14.421 element at address: 0x200027a00000 with size: 0.410034 MiB 00:07:14.421 element at address: 0x200000800000 with size: 0.355042 MiB 00:07:14.421 list of standard malloc elements. size: 199.218628 MiB 00:07:14.421 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:07:14.421 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:07:14.421 element at address: 0x200018afff80 with size: 1.000122 MiB 00:07:14.421 element at address: 0x200018cfff80 with size: 1.000122 MiB 00:07:14.421 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:14.421 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:14.421 element at address: 0x200018eeff00 with size: 0.062622 MiB 00:07:14.421 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:14.421 element at address: 0x200018eefdc0 with size: 0.000305 MiB 00:07:14.421 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20000085b040 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20000085b100 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000008df880 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200000cff000 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200000cff0c0 with size: 0.000183 
MiB 00:07:14.421 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20001067b600 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200012cf44c0 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200018eefc40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200018eefd00 with size: 0.000183 MiB 00:07:14.421 element at address: 0x2000190bc740 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20001a695380 with size: 0.000183 MiB 00:07:14.421 element at address: 0x20001a695440 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200027a68f80 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200027a69040 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200027a6fc40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200027a6fe40 with size: 0.000183 MiB 00:07:14.421 element at address: 0x200027a6ff00 with size: 0.000183 MiB 00:07:14.421 list of memzone associated elements. size: 599.918884 MiB 00:07:14.421 element at address: 0x20001a695500 with size: 211.416748 MiB 00:07:14.421 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:14.421 element at address: 0x200027a6ffc0 with size: 157.562561 MiB 00:07:14.421 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:14.421 element at address: 0x200012df4780 with size: 92.045044 MiB 00:07:14.421 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_138904_0 00:07:14.421 element at address: 0x200000dff380 with size: 48.003052 MiB 00:07:14.421 associated memzone info: size: 48.002930 MiB name: MP_msgpool_138904_0 00:07:14.421 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:07:14.421 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_138904_0 00:07:14.421 element at address: 0x2000191be940 with size: 20.255554 MiB 00:07:14.421 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:14.421 element at address: 0x2000319feb40 with size: 18.005066 MiB 00:07:14.421 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:14.421 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:07:14.421 associated memzone info: size: 3.000122 MiB name: MP_evtpool_138904_0 00:07:14.421 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:07:14.421 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_138904 00:07:14.421 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:14.421 associated memzone info: size: 1.007996 MiB name: MP_evtpool_138904 00:07:14.421 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:07:14.421 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:14.421 element at address: 0x2000190bc800 with size: 1.008118 MiB 00:07:14.421 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:14.421 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:07:14.421 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:14.421 element at address: 0x200003efde40 with size: 1.008118 MiB 00:07:14.421 associated memzone info: size: 1.007996 MiB name: 
MP_SCSI_TASK_Pool 00:07:14.421 element at address: 0x200000cff180 with size: 1.000488 MiB 00:07:14.421 associated memzone info: size: 1.000366 MiB name: RG_ring_0_138904 00:07:14.421 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:07:14.421 associated memzone info: size: 1.000366 MiB name: RG_ring_1_138904 00:07:14.421 element at address: 0x200012cf4580 with size: 1.000488 MiB 00:07:14.421 associated memzone info: size: 1.000366 MiB name: RG_ring_4_138904 00:07:14.421 element at address: 0x2000318fe940 with size: 1.000488 MiB 00:07:14.421 associated memzone info: size: 1.000366 MiB name: RG_ring_5_138904 00:07:14.421 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:07:14.421 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_138904 00:07:14.421 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:07:14.421 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_138904 00:07:14.421 element at address: 0x20001067b780 with size: 0.500488 MiB 00:07:14.421 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:14.421 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:07:14.421 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:14.421 element at address: 0x20001907c540 with size: 0.250488 MiB 00:07:14.421 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:14.421 element at address: 0x2000002b7a40 with size: 0.125488 MiB 00:07:14.421 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_138904 00:07:14.421 element at address: 0x2000008df940 with size: 0.125488 MiB 00:07:14.421 associated memzone info: size: 0.125366 MiB name: RG_ring_2_138904 00:07:14.421 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:07:14.421 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:14.421 element at address: 0x200027a69100 with size: 0.023743 MiB 00:07:14.421 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:14.421 element at address: 0x2000008db680 with size: 0.016113 MiB 00:07:14.421 associated memzone info: size: 0.015991 MiB name: RG_ring_3_138904 00:07:14.421 element at address: 0x200027a6f240 with size: 0.002441 MiB 00:07:14.421 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:14.421 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:07:14.421 associated memzone info: size: 0.000183 MiB name: MP_msgpool_138904 00:07:14.421 element at address: 0x2000008db480 with size: 0.000305 MiB 00:07:14.422 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_138904 00:07:14.422 element at address: 0x20000085af00 with size: 0.000305 MiB 00:07:14.422 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_138904 00:07:14.422 element at address: 0x200027a6fd00 with size: 0.000305 MiB 00:07:14.422 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:14.681 04:23:53 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:14.681 04:23:53 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 138904 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 138904 ']' 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 138904 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
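[editor's note] The dpdk_mem_utility trace above boils down to two script calls: env_dpdk_get_mem_stats dumps the target's DPDK memory state to /tmp/spdk_mem_dump.txt, and scripts/dpdk_mem_info.py (plain, then with -m 0) renders the heap/mempool/memzone summary and the per-heap element list shown above. A minimal sketch of reproducing that flow by hand, assuming a running spdk_tgt on the default /var/tmp/spdk.sock and an SPDK checkout at $SPDK_ROOT (both are assumptions, not taken from this log):

# hedged sketch; SPDK_ROOT is a hypothetical placeholder
SPDK_ROOT=${SPDK_ROOT:-/path/to/spdk}

# ask the target to write its DPDK memory stats to /tmp/spdk_mem_dump.txt
"$SPDK_ROOT/scripts/rpc.py" env_dpdk_get_mem_stats

# summarize total heap, mempool and memzone sizes from the dump
"$SPDK_ROOT/scripts/dpdk_mem_info.py"

# break down heap id 0 into busy/free elements, as in the listing above
"$SPDK_ROOT/scripts/dpdk_mem_info.py" -m 0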
00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138904 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138904' 00:07:14.681 killing process with pid 138904 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 138904 00:07:14.681 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 138904 00:07:14.941 00:07:14.941 real 0m0.974s 00:07:14.941 user 0m0.914s 00:07:14.941 sys 0m0.449s 00:07:14.941 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:14.941 04:23:53 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:14.941 ************************************ 00:07:14.941 END TEST dpdk_mem_utility 00:07:14.941 ************************************ 00:07:14.941 04:23:53 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:14.941 04:23:53 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:14.941 04:23:53 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:14.941 04:23:53 -- common/autotest_common.sh@10 -- # set +x 00:07:14.941 ************************************ 00:07:14.941 START TEST event 00:07:14.941 ************************************ 00:07:14.941 04:23:53 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:15.201 * Looking for test storage... 00:07:15.201 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:15.201 04:23:53 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:15.201 04:23:53 event -- common/autotest_common.sh@1693 -- # lcov --version 00:07:15.201 04:23:53 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:15.201 04:23:53 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:15.201 04:23:53 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:15.201 04:23:53 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:15.201 04:23:53 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:15.201 04:23:53 event -- scripts/common.sh@336 -- # IFS=.-: 00:07:15.201 04:23:53 event -- scripts/common.sh@336 -- # read -ra ver1 00:07:15.201 04:23:53 event -- scripts/common.sh@337 -- # IFS=.-: 00:07:15.201 04:23:53 event -- scripts/common.sh@337 -- # read -ra ver2 00:07:15.201 04:23:53 event -- scripts/common.sh@338 -- # local 'op=<' 00:07:15.201 04:23:53 event -- scripts/common.sh@340 -- # ver1_l=2 00:07:15.201 04:23:53 event -- scripts/common.sh@341 -- # ver2_l=1 00:07:15.201 04:23:53 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:15.201 04:23:53 event -- scripts/common.sh@344 -- # case "$op" in 00:07:15.201 04:23:53 event -- scripts/common.sh@345 -- # : 1 00:07:15.201 04:23:53 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:15.202 04:23:53 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:15.202 04:23:53 event -- scripts/common.sh@365 -- # decimal 1 00:07:15.202 04:23:53 event -- scripts/common.sh@353 -- # local d=1 00:07:15.202 04:23:53 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:15.202 04:23:53 event -- scripts/common.sh@355 -- # echo 1 00:07:15.202 04:23:53 event -- scripts/common.sh@365 -- # ver1[v]=1 00:07:15.202 04:23:53 event -- scripts/common.sh@366 -- # decimal 2 00:07:15.202 04:23:53 event -- scripts/common.sh@353 -- # local d=2 00:07:15.202 04:23:53 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:15.202 04:23:53 event -- scripts/common.sh@355 -- # echo 2 00:07:15.202 04:23:53 event -- scripts/common.sh@366 -- # ver2[v]=2 00:07:15.202 04:23:53 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:15.202 04:23:53 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:15.202 04:23:53 event -- scripts/common.sh@368 -- # return 0 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:15.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.202 --rc genhtml_branch_coverage=1 00:07:15.202 --rc genhtml_function_coverage=1 00:07:15.202 --rc genhtml_legend=1 00:07:15.202 --rc geninfo_all_blocks=1 00:07:15.202 --rc geninfo_unexecuted_blocks=1 00:07:15.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.202 ' 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:15.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.202 --rc genhtml_branch_coverage=1 00:07:15.202 --rc genhtml_function_coverage=1 00:07:15.202 --rc genhtml_legend=1 00:07:15.202 --rc geninfo_all_blocks=1 00:07:15.202 --rc geninfo_unexecuted_blocks=1 00:07:15.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.202 ' 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:15.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.202 --rc genhtml_branch_coverage=1 00:07:15.202 --rc genhtml_function_coverage=1 00:07:15.202 --rc genhtml_legend=1 00:07:15.202 --rc geninfo_all_blocks=1 00:07:15.202 --rc geninfo_unexecuted_blocks=1 00:07:15.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.202 ' 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:15.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.202 --rc genhtml_branch_coverage=1 00:07:15.202 --rc genhtml_function_coverage=1 00:07:15.202 --rc genhtml_legend=1 00:07:15.202 --rc geninfo_all_blocks=1 00:07:15.202 --rc geninfo_unexecuted_blocks=1 00:07:15.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.202 ' 00:07:15.202 04:23:53 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:15.202 04:23:53 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:15.202 04:23:53 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:07:15.202 04:23:53 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
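[editor's note] The lt 1.15 2 / cmp_versions trace that precedes every test above is the harness checking whether the installed lcov is older than 2 before choosing the LCOV option spelling to export. A simplified, hedged reconstruction of that comparison (not the exact scripts/common.sh implementation):

# hedged sketch of the version comparison exercised in the traces above
cmp_lt() {                        # usage: cmp_lt 1.15 2  -> returns 0 if $1 < $2
    local -a ver1 ver2
    local v d1 d2
    IFS=.-: read -ra ver1 <<< "$1"    # split on ., - and :
    IFS=.-: read -ra ver2 <<< "$2"
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
        d1=${ver1[v]:-0} d2=${ver2[v]:-0}  # missing components compare as 0
        ((d1 > d2)) && return 1
        ((d1 < d2)) && return 0
    done
    return 1                      # equal versions are not "less than"
}

cmp_lt 1.15 2 && echo "lcov < 2: use the --rc lcov_* option spelling"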
00:07:15.202 04:23:53 event -- common/autotest_common.sh@10 -- # set +x 00:07:15.202 ************************************ 00:07:15.202 START TEST event_perf 00:07:15.202 ************************************ 00:07:15.202 04:23:53 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:15.202 Running I/O for 1 seconds...[2024-11-17 04:23:53.935369] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:15.202 [2024-11-17 04:23:53.935449] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139032 ] 00:07:15.202 [2024-11-17 04:23:54.026569] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:15.462 [2024-11-17 04:23:54.052828] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.462 [2024-11-17 04:23:54.052939] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:15.462 [2024-11-17 04:23:54.053049] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.462 [2024-11-17 04:23:54.053050] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:16.402 Running I/O for 1 seconds... 00:07:16.402 lcore 0: 195888 00:07:16.402 lcore 1: 195886 00:07:16.402 lcore 2: 195886 00:07:16.402 lcore 3: 195887 00:07:16.403 done. 00:07:16.403 00:07:16.403 real 0m1.168s 00:07:16.403 user 0m4.067s 00:07:16.403 sys 0m0.097s 00:07:16.403 04:23:55 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:16.403 04:23:55 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:16.403 ************************************ 00:07:16.403 END TEST event_perf 00:07:16.403 ************************************ 00:07:16.403 04:23:55 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:16.403 04:23:55 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:16.403 04:23:55 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:16.403 04:23:55 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.403 ************************************ 00:07:16.403 START TEST event_reactor 00:07:16.403 ************************************ 00:07:16.403 04:23:55 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:16.403 [2024-11-17 04:23:55.190722] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:16.403 [2024-11-17 04:23:55.190807] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139313 ] 00:07:16.663 [2024-11-17 04:23:55.280990] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.663 [2024-11-17 04:23:55.304275] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.602 test_start 00:07:17.602 oneshot 00:07:17.602 tick 100 00:07:17.602 tick 100 00:07:17.602 tick 250 00:07:17.602 tick 100 00:07:17.602 tick 100 00:07:17.602 tick 100 00:07:17.602 tick 250 00:07:17.602 tick 500 00:07:17.602 tick 100 00:07:17.602 tick 100 00:07:17.602 tick 250 00:07:17.602 tick 100 00:07:17.602 tick 100 00:07:17.602 test_end 00:07:17.602 00:07:17.602 real 0m1.161s 00:07:17.602 user 0m1.068s 00:07:17.602 sys 0m0.088s 00:07:17.602 04:23:56 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:17.602 04:23:56 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:17.602 ************************************ 00:07:17.602 END TEST event_reactor 00:07:17.602 ************************************ 00:07:17.602 04:23:56 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:17.602 04:23:56 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:17.602 04:23:56 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:17.602 04:23:56 event -- common/autotest_common.sh@10 -- # set +x 00:07:17.602 ************************************ 00:07:17.602 START TEST event_reactor_perf 00:07:17.602 ************************************ 00:07:17.602 04:23:56 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:17.862 [2024-11-17 04:23:56.437260] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:17.862 [2024-11-17 04:23:56.437349] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139601 ] 00:07:17.862 [2024-11-17 04:23:56.526950] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.862 [2024-11-17 04:23:56.551860] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.802 test_start 00:07:18.803 test_end 00:07:18.803 Performance: 954284 events per second 00:07:18.803 00:07:18.803 real 0m1.162s 00:07:18.803 user 0m1.060s 00:07:18.803 sys 0m0.097s 00:07:18.803 04:23:57 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:18.803 04:23:57 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:18.803 ************************************ 00:07:18.803 END TEST event_reactor_perf 00:07:18.803 ************************************ 00:07:18.803 04:23:57 event -- event/event.sh@49 -- # uname -s 00:07:18.803 04:23:57 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:18.803 04:23:57 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:18.803 04:23:57 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:19.063 04:23:57 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:19.063 04:23:57 event -- common/autotest_common.sh@10 -- # set +x 00:07:19.063 ************************************ 00:07:19.063 START TEST event_scheduler 00:07:19.063 ************************************ 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:19.063 * Looking for test storage... 
00:07:19.063 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:19.063 04:23:57 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:19.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.063 --rc genhtml_branch_coverage=1 00:07:19.063 --rc genhtml_function_coverage=1 00:07:19.063 --rc genhtml_legend=1 00:07:19.063 --rc geninfo_all_blocks=1 00:07:19.063 --rc geninfo_unexecuted_blocks=1 00:07:19.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.063 ' 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:19.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.063 --rc genhtml_branch_coverage=1 00:07:19.063 --rc genhtml_function_coverage=1 00:07:19.063 --rc genhtml_legend=1 00:07:19.063 --rc geninfo_all_blocks=1 00:07:19.063 --rc geninfo_unexecuted_blocks=1 00:07:19.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.063 ' 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:19.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.063 --rc genhtml_branch_coverage=1 00:07:19.063 --rc genhtml_function_coverage=1 00:07:19.063 --rc genhtml_legend=1 00:07:19.063 --rc geninfo_all_blocks=1 00:07:19.063 --rc geninfo_unexecuted_blocks=1 00:07:19.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.063 ' 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:19.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.063 --rc genhtml_branch_coverage=1 00:07:19.063 --rc genhtml_function_coverage=1 00:07:19.063 --rc genhtml_legend=1 00:07:19.063 --rc geninfo_all_blocks=1 00:07:19.063 --rc geninfo_unexecuted_blocks=1 00:07:19.063 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.063 ' 00:07:19.063 04:23:57 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:19.063 04:23:57 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=139919 00:07:19.063 04:23:57 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:19.063 04:23:57 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:19.063 04:23:57 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 139919 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 139919 ']' 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.063 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:19.063 04:23:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:19.323 [2024-11-17 04:23:57.896164] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:19.323 [2024-11-17 04:23:57.896251] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139919 ] 00:07:19.323 [2024-11-17 04:23:57.988449] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:19.323 [2024-11-17 04:23:58.016572] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.323 [2024-11-17 04:23:58.016602] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:19.323 [2024-11-17 04:23:58.016725] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:19.323 [2024-11-17 04:23:58.016726] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:19.323 04:23:58 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:19.323 04:23:58 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:07:19.323 04:23:58 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:19.323 04:23:58 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.323 04:23:58 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:19.323 [2024-11-17 04:23:58.077509] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:19.323 [2024-11-17 04:23:58.077530] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:19.323 [2024-11-17 04:23:58.077541] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:19.323 [2024-11-17 04:23:58.077549] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:19.323 [2024-11-17 04:23:58.077556] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:19.323 04:23:58 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.324 04:23:58 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:19.324 04:23:58 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.324 
04:23:58 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:19.324 [2024-11-17 04:23:58.145267] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:19.324 04:23:58 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.324 04:23:58 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:19.324 04:23:58 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:19.324 04:23:58 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:19.324 04:23:58 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 ************************************ 00:07:19.588 START TEST scheduler_create_thread 00:07:19.588 ************************************ 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 2 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 3 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 4 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 5 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 6 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 7 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 8 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.588 9 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.588 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.589 10 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.589 04:23:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:20.546 04:23:59 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:20.546 04:23:59 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:20.546 04:23:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:20.546 04:23:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.926 04:24:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:21.926 04:24:00 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:21.926 04:24:00 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:21.926 04:24:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:21.926 04:24:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:22.863 04:24:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:22.863 00:07:22.863 real 0m3.381s 00:07:22.863 user 0m0.024s 00:07:22.863 sys 0m0.007s 00:07:22.863 04:24:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:22.863 04:24:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:22.863 ************************************ 00:07:22.863 END TEST scheduler_create_thread 00:07:22.863 ************************************ 00:07:22.863 04:24:01 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:22.863 04:24:01 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 139919 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 139919 ']' 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 139919 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 139919 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 139919' 00:07:22.863 killing process with pid 139919 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 139919 00:07:22.863 04:24:01 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 139919 00:07:23.122 [2024-11-17 04:24:01.945442] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
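The trace above is the scheduler_create_thread test: it brings up the scheduler test app with --wait-for-rpc, switches to the dynamic scheduler, creates pinned active and idle threads on each of the four cores in the 0xF mask, then creates, re-weights and deletes one extra thread. A minimal bash sketch of driving the same RPCs by hand follows; it reuses the paths and RPC names printed in this run, while the startup wait, plugin lookup and thread-id capture are simplified assumptions, not part of the captured log.

  # Sketch only: replay the scheduler_create_thread RPC flow seen above.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py --plugin scheduler_plugin"   # plugin is the test's own; the run makes it importable

  "$SPDK/test/event/scheduler/scheduler" -m 0xF -p 0x2 --wait-for-rpc -f &
  scheduler_pid=$!
  # the test script waits for /var/tmp/spdk.sock here (waitforlisten) before issuing RPCs

  $RPC framework_set_scheduler dynamic      # pick the scheduler before init
  $RPC framework_start_init                 # finish SPDK startup

  # one busy and one idle thread pinned per core, as scheduler.sh@12-19 does
  for mask in 0x1 0x2 0x4 0x8; do
    $RPC scheduler_thread_create -n active_pinned -m "$mask" -a 100
    $RPC scheduler_thread_create -n idle_pinned   -m "$mask" -a 0
  done

  # the RPC prints the new thread id; scheduler.sh captures it the same way
  tid=$($RPC scheduler_thread_create -n half_active -a 0)
  $RPC scheduler_thread_set_active "$tid" 50
  $RPC scheduler_thread_delete "$tid"

  kill "$scheduler_pid"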
00:07:23.382 00:07:23.382 real 0m4.466s 00:07:23.382 user 0m7.785s 00:07:23.382 sys 0m0.470s 00:07:23.382 04:24:02 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:23.382 04:24:02 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:23.382 ************************************ 00:07:23.382 END TEST event_scheduler 00:07:23.382 ************************************ 00:07:23.382 04:24:02 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:23.382 04:24:02 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:23.382 04:24:02 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:23.382 04:24:02 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:23.382 04:24:02 event -- common/autotest_common.sh@10 -- # set +x 00:07:23.643 ************************************ 00:07:23.643 START TEST app_repeat 00:07:23.643 ************************************ 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@19 -- # repeat_pid=140889 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 140889' 00:07:23.643 Process app_repeat pid: 140889 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:23.643 spdk_app_start Round 0 00:07:23.643 04:24:02 event.app_repeat -- event/event.sh@25 -- # waitforlisten 140889 /var/tmp/spdk-nbd.sock 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140889 ']' 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:23.643 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:23.643 04:24:02 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:23.643 [2024-11-17 04:24:02.262407] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:23.643 [2024-11-17 04:24:02.262494] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140889 ] 00:07:23.643 [2024-11-17 04:24:02.334219] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:23.643 [2024-11-17 04:24:02.359163] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.643 [2024-11-17 04:24:02.359163] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.644 04:24:02 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:23.644 04:24:02 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:23.644 04:24:02 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.903 Malloc0 00:07:23.903 04:24:02 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:24.163 Malloc1 00:07:24.163 04:24:02 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.163 04:24:02 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:24.422 /dev/nbd0 00:07:24.423 04:24:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:24.423 04:24:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 
/proc/partitions 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.423 1+0 records in 00:07:24.423 1+0 records out 00:07:24.423 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233203 s, 17.6 MB/s 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:24.423 04:24:03 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:24.423 04:24:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.423 04:24:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.423 04:24:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:24.682 /dev/nbd1 00:07:24.682 04:24:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:24.682 04:24:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:24.682 04:24:03 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:24.682 04:24:03 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:24.682 04:24:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:24.682 04:24:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.683 1+0 records in 00:07:24.683 1+0 records out 00:07:24.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279134 s, 14.7 MB/s 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:24.683 04:24:03 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:24.683 04:24:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.683 04:24:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
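Each nbd_start_disk above is followed by the waitfornbd probe from common/autotest_common.sh, whose lines (@872-@893) the trace replays: poll /proc/partitions for the device, then perform one direct 4 KiB read and check the copied size. A rough reconstruction is below; it is not the original helper, and the retry pacing and temp-file path are illustrative assumptions.

  # Sketch of the readiness probe traced above (autotest_common.sh waitfornbd).
  waitfornbd() {
    local nbd_name=$1 i size tmp=/tmp/nbdtest   # the run writes $SPDK/test/event/nbdtest

    # wait until the kernel lists the device in /proc/partitions
    for ((i = 1; i <= 20; i++)); do
      grep -q -w "$nbd_name" /proc/partitions && break
      sleep 0.1                                 # pacing is an assumption, not from the log
    done

    # then confirm a direct 4 KiB read actually returns data
    for ((i = 1; i <= 20; i++)); do
      dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct 2> /dev/null
      size=$(stat -c %s "$tmp")
      rm -f "$tmp"
      [ "$size" != 0 ] && return 0
      sleep 0.1
    done
    return 1
  }

  waitfornbd nbd1    # e.g. after exporting Malloc1 on /dev/nbd1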
00:07:24.683 04:24:03 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.683 04:24:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.683 04:24:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.942 { 00:07:24.942 "nbd_device": "/dev/nbd0", 00:07:24.942 "bdev_name": "Malloc0" 00:07:24.942 }, 00:07:24.942 { 00:07:24.942 "nbd_device": "/dev/nbd1", 00:07:24.942 "bdev_name": "Malloc1" 00:07:24.942 } 00:07:24.942 ]' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.942 { 00:07:24.942 "nbd_device": "/dev/nbd0", 00:07:24.942 "bdev_name": "Malloc0" 00:07:24.942 }, 00:07:24.942 { 00:07:24.942 "nbd_device": "/dev/nbd1", 00:07:24.942 "bdev_name": "Malloc1" 00:07:24.942 } 00:07:24.942 ]' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.942 /dev/nbd1' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.942 /dev/nbd1' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:24.942 256+0 records in 00:07:24.942 256+0 records out 00:07:24.942 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104872 s, 100 MB/s 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.942 04:24:03 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.942 256+0 records in 00:07:24.942 256+0 records out 00:07:24.942 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0175504 s, 59.7 MB/s 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.943 256+0 records in 00:07:24.943 256+0 records out 00:07:24.943 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0227242 s, 46.1 MB/s 
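The 256+0 record pairs above are nbd_dd_data_verify's write pass: seed a 1 MiB random file, copy it onto each exported device with O_DIRECT, then (in the verify pass that follows) cmp each device against the seed. A condensed sketch, using this run's paths and device list:

  # Sketch of the write-then-verify pattern traced above (nbd_common.sh@70-85).
  tmp=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
  nbd_list=(/dev/nbd0 /dev/nbd1)

  dd if=/dev/urandom of="$tmp" bs=4096 count=256            # 1 MiB of random data

  for dev in "${nbd_list[@]}"; do
    dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct    # write it through each nbd
  done

  for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp" "$dev"                               # read back and compare
  done

  rm "$tmp"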
00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.943 04:24:03 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:25.202 04:24:03 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:25.461 04:24:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:25.461 04:24:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:25.461 04:24:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:25.461 04:24:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.461 04:24:04 event.app_repeat -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.462 04:24:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:25.462 04:24:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:25.462 04:24:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.462 04:24:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:25.462 04:24:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.462 04:24:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:25.721 04:24:04 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:25.721 04:24:04 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:25.981 04:24:04 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:25.981 [2024-11-17 04:24:04.773177] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.981 [2024-11-17 04:24:04.792778] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.981 [2024-11-17 04:24:04.792778] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.241 [2024-11-17 04:24:04.833980] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:26.241 [2024-11-17 04:24:04.834022] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:29.535 04:24:07 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:29.535 04:24:07 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:29.535 spdk_app_start Round 1 00:07:29.535 04:24:07 event.app_repeat -- event/event.sh@25 -- # waitforlisten 140889 /var/tmp/spdk-nbd.sock 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140889 ']' 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:29.535 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
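Round 0 ends with spdk_kill_instance SIGTERM and a 3 second pause, and the lines above already begin Round 1 of the same sequence. The outer shape of app_repeat_test, as this trace replays it, is roughly the loop below; the helper names and arguments are the ones event.sh prints, and the surrounding setup (app_repeat started with -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4) is taken from the log rather than shown here.

  # Sketch of the app_repeat round loop traced above (event.sh@23-35).
  RPC="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

  for i in {0..2}; do
    echo "spdk_app_start Round $i"
    waitforlisten "$repeat_pid" /var/tmp/spdk-nbd.sock       # wait for the RPC socket

    $RPC bdev_malloc_create 64 4096                          # 64 MB, 4 KiB blocks -> Malloc0
    $RPC bdev_malloc_create 64 4096                          # -> Malloc1

    nbd_rpc_data_verify /var/tmp/spdk-nbd.sock \
        'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'              # export, write, cmp, detach

    $RPC spdk_kill_instance SIGTERM                          # app_repeat restarts itself
    sleep 3
  done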
00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:29.535 04:24:07 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:29.535 04:24:07 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:29.535 Malloc0 00:07:29.535 04:24:08 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:29.535 Malloc1 00:07:29.535 04:24:08 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.535 04:24:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:29.795 /dev/nbd0 00:07:29.795 04:24:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.795 04:24:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.795 1+0 records in 00:07:29.795 1+0 records out 00:07:29.795 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252963 s, 16.2 MB/s 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:29.795 04:24:08 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:29.795 04:24:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.795 04:24:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.796 04:24:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:30.055 /dev/nbd1 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:30.055 1+0 records in 00:07:30.055 1+0 records out 00:07:30.055 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000193752 s, 21.1 MB/s 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:30.055 04:24:08 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.055 04:24:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:30.315 04:24:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:30.315 { 00:07:30.315 "nbd_device": "/dev/nbd0", 00:07:30.315 "bdev_name": "Malloc0" 00:07:30.315 }, 00:07:30.315 { 00:07:30.315 "nbd_device": "/dev/nbd1", 00:07:30.315 "bdev_name": "Malloc1" 00:07:30.315 } 00:07:30.315 ]' 00:07:30.315 04:24:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:30.315 { 00:07:30.315 "nbd_device": "/dev/nbd0", 00:07:30.315 "bdev_name": "Malloc0" 00:07:30.315 }, 00:07:30.315 { 00:07:30.315 "nbd_device": "/dev/nbd1", 00:07:30.315 "bdev_name": "Malloc1" 00:07:30.315 } 00:07:30.315 ]' 00:07:30.315 04:24:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:30.315 /dev/nbd1' 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:30.315 /dev/nbd1' 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:30.315 256+0 records in 00:07:30.315 256+0 records out 00:07:30.315 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110634 s, 94.8 MB/s 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:30.315 256+0 records in 00:07:30.315 256+0 records out 00:07:30.315 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202195 s, 51.9 MB/s 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:30.315 04:24:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:30.316 256+0 records in 00:07:30.316 256+0 records out 00:07:30.316 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201722 s, 52.0 MB/s 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.316 04:24:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.575 04:24:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.835 04:24:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:31.095 04:24:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:31.095 04:24:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:31.095 04:24:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:31.096 04:24:09 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:31.096 04:24:09 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:31.356 04:24:09 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:31.356 [2024-11-17 04:24:10.135902] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:31.356 [2024-11-17 04:24:10.156981] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.356 [2024-11-17 04:24:10.156982] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.616 [2024-11-17 04:24:10.199342] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:31.616 [2024-11-17 04:24:10.199383] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:34.908 04:24:12 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:34.908 04:24:13 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:34.908 spdk_app_start Round 2 00:07:34.908 04:24:13 event.app_repeat -- event/event.sh@25 -- # waitforlisten 140889 /var/tmp/spdk-nbd.sock 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140889 ']' 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:34.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
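Round 2 repeats the same export, verify and teardown cycle. The nbd bookkeeping that nbd_start_disks, nbd_get_count and nbd_stop_disks perform over RPC, including the jq parse of nbd_get_disks shown in the trace, boils down to the sketch below (socket path and device names taken from this run):

  # Sketch of the nbd export and teardown RPCs traced above (nbd_common.sh@15/@63/@54).
  RPC="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

  $RPC nbd_start_disk Malloc0 /dev/nbd0        # attach each bdev to a kernel nbd node
  $RPC nbd_start_disk Malloc1 /dev/nbd1

  # count exported devices the way nbd_get_count does
  count=$($RPC nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd)
  [ "$count" -eq 2 ]

  $RPC nbd_stop_disk /dev/nbd0                 # detach; nbd_get_disks then returns []
  $RPC nbd_stop_disk /dev/nbd1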
00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.908 04:24:13 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:34.908 04:24:13 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:34.908 Malloc0 00:07:34.908 04:24:13 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:34.908 Malloc1 00:07:34.908 04:24:13 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:34.908 04:24:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:34.909 04:24:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.909 04:24:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:35.169 /dev/nbd0 00:07:35.169 04:24:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:35.169 04:24:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:35.169 1+0 records in 00:07:35.169 1+0 records out 00:07:35.169 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228045 s, 18.0 MB/s 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:35.169 04:24:13 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:35.169 04:24:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.169 04:24:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:35.169 04:24:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:35.429 /dev/nbd1 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:35.429 1+0 records in 00:07:35.429 1+0 records out 00:07:35.429 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002897 s, 14.1 MB/s 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:35.429 04:24:14 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.429 04:24:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:35.690 { 00:07:35.690 "nbd_device": "/dev/nbd0", 00:07:35.690 "bdev_name": "Malloc0" 00:07:35.690 }, 00:07:35.690 { 00:07:35.690 "nbd_device": "/dev/nbd1", 00:07:35.690 "bdev_name": "Malloc1" 00:07:35.690 } 00:07:35.690 ]' 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:35.690 { 00:07:35.690 "nbd_device": "/dev/nbd0", 00:07:35.690 "bdev_name": "Malloc0" 00:07:35.690 }, 00:07:35.690 { 00:07:35.690 "nbd_device": "/dev/nbd1", 00:07:35.690 "bdev_name": "Malloc1" 00:07:35.690 } 00:07:35.690 ]' 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:35.690 /dev/nbd1' 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:35.690 /dev/nbd1' 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:35.690 04:24:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:35.691 256+0 records in 00:07:35.691 256+0 records out 00:07:35.691 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104632 s, 100 MB/s 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:35.691 256+0 records in 00:07:35.691 256+0 records out 00:07:35.691 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0204377 s, 51.3 MB/s 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:35.691 256+0 records in 00:07:35.691 256+0 records out 00:07:35.691 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218768 s, 47.9 MB/s 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:35.691 04:24:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:35.951 04:24:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.210 04:24:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:36.471 04:24:15 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:36.471 04:24:15 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:36.731 04:24:15 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:36.731 [2024-11-17 04:24:15.471638] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:36.731 [2024-11-17 04:24:15.490901] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:36.731 [2024-11-17 04:24:15.490903] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.731 [2024-11-17 04:24:15.531836] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:36.731 [2024-11-17 04:24:15.531881] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:40.027 04:24:18 event.app_repeat -- event/event.sh@38 -- # waitforlisten 140889 /var/tmp/spdk-nbd.sock 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 140889 ']' 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:40.027 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
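For reference, the nbd round traced above boils down to the following sequence; this is a minimal sketch, assuming spdk_tgt is already running and owns /var/tmp/spdk-nbd.sock, with the full workspace path to scripts/rpc.py shortened to rpc.py and the temp file kept local:

    # create two malloc bdevs (sized 64 MB, 4096-byte block size)
    rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096    # -> Malloc0
    rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096    # -> Malloc1
    # expose them as kernel nbd block devices
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
    # write 1 MiB of random data through each device, then compare the device contents back against the file
    dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
    for dev in /dev/nbd0 /dev/nbd1; do
        dd if=nbdrandtest of="$dev" bs=4096 count=256 oflag=direct
        cmp -b -n 1M nbdrandtest "$dev"
    done
    # detach both devices and confirm nothing is still exported
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
    rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks    # expected: []

Each app_repeat round runs this cycle, then sends spdk_kill_instance SIGTERM and restarts the app for the next round.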
00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:40.027 04:24:18 event.app_repeat -- event/event.sh@39 -- # killprocess 140889 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 140889 ']' 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 140889 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 140889 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 140889' 00:07:40.027 killing process with pid 140889 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@973 -- # kill 140889 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@978 -- # wait 140889 00:07:40.027 spdk_app_start is called in Round 0. 00:07:40.027 Shutdown signal received, stop current app iteration 00:07:40.027 Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 reinitialization... 00:07:40.027 spdk_app_start is called in Round 1. 00:07:40.027 Shutdown signal received, stop current app iteration 00:07:40.027 Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 reinitialization... 00:07:40.027 spdk_app_start is called in Round 2. 00:07:40.027 Shutdown signal received, stop current app iteration 00:07:40.027 Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 reinitialization... 00:07:40.027 spdk_app_start is called in Round 3. 
00:07:40.027 Shutdown signal received, stop current app iteration 00:07:40.027 04:24:18 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:40.027 04:24:18 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:40.027 00:07:40.027 real 0m16.496s 00:07:40.027 user 0m35.811s 00:07:40.027 sys 0m3.199s 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:40.027 04:24:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:40.027 ************************************ 00:07:40.027 END TEST app_repeat 00:07:40.027 ************************************ 00:07:40.027 04:24:18 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:40.027 04:24:18 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:40.027 04:24:18 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:40.027 04:24:18 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:40.027 04:24:18 event -- common/autotest_common.sh@10 -- # set +x 00:07:40.027 ************************************ 00:07:40.027 START TEST cpu_locks 00:07:40.027 ************************************ 00:07:40.027 04:24:18 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:40.288 * Looking for test storage... 00:07:40.288 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:40.288 04:24:18 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:40.288 04:24:18 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version 00:07:40.288 04:24:18 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:40.288 04:24:18 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:40.288 04:24:18 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:40.288 04:24:19 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:40.288 04:24:19 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:40.288 04:24:19 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:40.288 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.288 --rc genhtml_branch_coverage=1 00:07:40.288 --rc genhtml_function_coverage=1 00:07:40.288 --rc genhtml_legend=1 00:07:40.288 --rc geninfo_all_blocks=1 00:07:40.288 --rc geninfo_unexecuted_blocks=1 00:07:40.288 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.288 ' 00:07:40.288 04:24:19 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:40.288 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.288 --rc genhtml_branch_coverage=1 00:07:40.288 --rc genhtml_function_coverage=1 00:07:40.288 --rc genhtml_legend=1 00:07:40.288 --rc geninfo_all_blocks=1 00:07:40.288 --rc geninfo_unexecuted_blocks=1 00:07:40.288 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.288 ' 00:07:40.288 04:24:19 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:40.288 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.288 --rc genhtml_branch_coverage=1 00:07:40.288 --rc genhtml_function_coverage=1 00:07:40.288 --rc genhtml_legend=1 00:07:40.288 --rc geninfo_all_blocks=1 00:07:40.289 --rc geninfo_unexecuted_blocks=1 00:07:40.289 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.289 ' 00:07:40.289 04:24:19 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:40.289 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:40.289 --rc genhtml_branch_coverage=1 00:07:40.289 --rc genhtml_function_coverage=1 00:07:40.289 --rc genhtml_legend=1 00:07:40.289 --rc geninfo_all_blocks=1 00:07:40.289 --rc geninfo_unexecuted_blocks=1 00:07:40.289 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:40.289 ' 00:07:40.289 04:24:19 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:40.289 04:24:19 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:40.289 04:24:19 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:40.289 04:24:19 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:40.289 04:24:19 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:40.289 04:24:19 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:40.289 04:24:19 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:40.289 ************************************ 00:07:40.289 START TEST default_locks 00:07:40.289 ************************************ 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=144408 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 144408 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 144408 ']' 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:40.289 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:40.289 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:40.289 [2024-11-17 04:24:19.079431] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:40.289 [2024-11-17 04:24:19.079495] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144408 ] 00:07:40.558 [2024-11-17 04:24:19.161741] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.559 [2024-11-17 04:24:19.184234] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.825 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:40.825 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:07:40.825 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 144408 00:07:40.825 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 144408 00:07:40.825 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:41.085 lslocks: write error 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 144408 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 144408 ']' 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 144408 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144408 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144408' 00:07:41.085 killing process with pid 144408 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 144408 00:07:41.085 04:24:19 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 144408 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 144408 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 144408 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 144408 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 144408 ']' 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 
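The default_locks check that just ran is essentially a file-lock probe: while spdk_tgt -m 0x1 is alive it holds a core lock that lslocks reports under the spdk_cpu_lock name, and killing the target releases it. A sketch under that assumption, with $pid holding the target's pid as in the trace:

    # lock present while the target runs
    lslocks -p "$pid" | grep -q spdk_cpu_lock && echo "core 0 lock held"
    # killprocess: terminate the target and reap it
    kill "$pid"; wait "$pid"
    # the follow-up waitforlisten on the dead pid is expected to fail,
    # and no spdk_cpu_lock entries should remain afterwards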
00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (144408) - No such process 00:07:41.345 ERROR: process (pid: 144408) is no longer running 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:41.345 00:07:41.345 real 0m0.995s 00:07:41.345 user 0m0.929s 00:07:41.345 sys 0m0.513s 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:41.345 04:24:20 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 ************************************ 00:07:41.345 END TEST default_locks 00:07:41.345 ************************************ 00:07:41.345 04:24:20 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:41.345 04:24:20 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:41.345 04:24:20 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:41.345 04:24:20 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 ************************************ 00:07:41.345 START TEST default_locks_via_rpc 00:07:41.345 ************************************ 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=144551 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 144551 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 144551 ']' 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:41.345 04:24:20 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.345 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 [2024-11-17 04:24:20.156710] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:41.345 [2024-11-17 04:24:20.156768] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144551 ] 00:07:41.606 [2024-11-17 04:24:20.223182] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.606 [2024-11-17 04:24:20.246235] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 144551 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 144551 00:07:41.866 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:42.126 04:24:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 144551 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 144551 ']' 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 144551 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
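default_locks_via_rpc toggles the same lock over RPC instead of by killing the target. rpc_cmd in the trace is assumed here to be a thin wrapper around scripts/rpc.py talking to /var/tmp/spdk.sock; roughly:

    rpc.py framework_disable_cpumask_locks    # drop the core locks while the target keeps running
    # the no_locks helper above then finds no spdk_cpu_lock files
    rpc.py framework_enable_cpumask_locks     # take the locks again
    lslocks -p "$pid" | grep -q spdk_cpu_lock && echo "core 0 lock held again"

The target is only killed at the end, once both RPCs have been exercised.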
00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144551 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144551' 00:07:42.127 killing process with pid 144551 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 144551 00:07:42.127 04:24:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 144551 00:07:42.387 00:07:42.387 real 0m1.065s 00:07:42.387 user 0m1.052s 00:07:42.387 sys 0m0.514s 00:07:42.387 04:24:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:42.387 04:24:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:42.387 ************************************ 00:07:42.387 END TEST default_locks_via_rpc 00:07:42.387 ************************************ 00:07:42.647 04:24:21 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:42.647 04:24:21 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:42.647 04:24:21 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:42.647 04:24:21 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:42.647 ************************************ 00:07:42.647 START TEST non_locking_app_on_locked_coremask 00:07:42.647 ************************************ 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=144839 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 144839 /var/tmp/spdk.sock 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144839 ']' 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:42.647 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:42.647 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:42.647 [2024-11-17 04:24:21.308338] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:42.647 [2024-11-17 04:24:21.308402] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144839 ] 00:07:42.647 [2024-11-17 04:24:21.393720] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.647 [2024-11-17 04:24:21.414652] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=144846 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 144846 /var/tmp/spdk2.sock 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144846 ']' 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:42.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:42.908 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:42.909 04:24:21 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:42.909 [2024-11-17 04:24:21.640454] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:42.909 [2024-11-17 04:24:21.640519] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144846 ] 00:07:42.909 [2024-11-17 04:24:21.733478] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
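The "CPU core locks deactivated." notice above is the whole point of non_locking_app_on_locked_coremask: the second target is started on the same core mask but with --disable-cpumask-locks and its own RPC socket, so it never attempts to claim the lock the first target already holds. In outline (binary path shortened to spdk_tgt):

    spdk_tgt -m 0x1 &                                                  # first target, claims the core 0 lock
    spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &   # second target, skips the lock claim

Both instances then run side by side until the test kills them in turn.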
00:07:42.909 [2024-11-17 04:24:21.733511] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.169 [2024-11-17 04:24:21.779310] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.429 04:24:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:43.429 04:24:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:43.429 04:24:22 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 144839 00:07:43.429 04:24:22 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 144839 00:07:43.429 04:24:22 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:44.810 lslocks: write error 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 144839 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144839 ']' 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 144839 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144839 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144839' 00:07:44.810 killing process with pid 144839 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 144839 00:07:44.810 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 144839 00:07:45.071 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 144846 00:07:45.071 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144846 ']' 00:07:45.071 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 144846 00:07:45.071 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:45.071 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:45.071 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144846 00:07:45.331 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:45.331 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:45.331 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144846' 00:07:45.331 killing 
process with pid 144846 00:07:45.331 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 144846 00:07:45.331 04:24:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 144846 00:07:45.591 00:07:45.591 real 0m2.945s 00:07:45.591 user 0m2.982s 00:07:45.591 sys 0m1.207s 00:07:45.591 04:24:24 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.591 04:24:24 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.591 ************************************ 00:07:45.591 END TEST non_locking_app_on_locked_coremask 00:07:45.591 ************************************ 00:07:45.591 04:24:24 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:45.591 04:24:24 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.591 04:24:24 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.591 04:24:24 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:45.591 ************************************ 00:07:45.591 START TEST locking_app_on_unlocked_coremask 00:07:45.591 ************************************ 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=145410 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 145410 /var/tmp/spdk.sock 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145410 ']' 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:45.591 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.591 [2024-11-17 04:24:24.340391] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:45.591 [2024-11-17 04:24:24.340455] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145410 ] 00:07:45.851 [2024-11-17 04:24:24.425766] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:45.851 [2024-11-17 04:24:24.425791] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.851 [2024-11-17 04:24:24.448326] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=145416 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 145416 /var/tmp/spdk2.sock 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145416 ']' 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:45.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:45.851 04:24:24 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.851 [2024-11-17 04:24:24.660049] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:45.851 [2024-11-17 04:24:24.660120] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145416 ] 00:07:46.111 [2024-11-17 04:24:24.753905] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.111 [2024-11-17 04:24:24.795445] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.372 04:24:25 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:46.372 04:24:25 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:46.372 04:24:25 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 145416 00:07:46.372 04:24:25 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 145416 00:07:46.372 04:24:25 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:47.753 lslocks: write error 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 145410 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145410 ']' 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 145410 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145410 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145410' 00:07:47.753 killing process with pid 145410 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 145410 00:07:47.753 04:24:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 145410 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 145416 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145416 ']' 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 145416 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145416 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:48.323 04:24:27 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145416' 00:07:48.323 killing process with pid 145416 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 145416 00:07:48.323 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 145416 00:07:48.583 00:07:48.583 real 0m3.046s 00:07:48.583 user 0m3.071s 00:07:48.583 sys 0m1.293s 00:07:48.583 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.583 04:24:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:48.583 ************************************ 00:07:48.583 END TEST locking_app_on_unlocked_coremask 00:07:48.583 ************************************ 00:07:48.583 04:24:27 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:48.583 04:24:27 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:48.583 04:24:27 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:48.583 04:24:27 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:48.844 ************************************ 00:07:48.844 START TEST locking_app_on_locked_coremask 00:07:48.844 ************************************ 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=145976 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 145976 /var/tmp/spdk.sock 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145976 ']' 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:48.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:48.844 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:48.844 [2024-11-17 04:24:27.471558] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
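locking_app_on_unlocked_coremask, which finished just above, inverts that arrangement: the first target runs with --disable-cpumask-locks so core 0 stays unclaimed, and the second, lock-taking target on the same mask is expected to start cleanly. Sketch with the binary path shortened:

    spdk_tgt -m 0x1 --disable-cpumask-locks &     # leaves the core 0 lock unclaimed
    spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &      # claims core 0 without a conflict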
00:07:48.844 [2024-11-17 04:24:27.471623] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145976 ] 00:07:48.844 [2024-11-17 04:24:27.557801] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.844 [2024-11-17 04:24:27.577936] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=145985 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 145985 /var/tmp/spdk2.sock 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 145985 /var/tmp/spdk2.sock 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 145985 /var/tmp/spdk2.sock 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145985 ']' 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:49.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:49.104 04:24:27 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:49.104 [2024-11-17 04:24:27.805203] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:49.104 [2024-11-17 04:24:27.805268] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145985 ] 00:07:49.104 [2024-11-17 04:24:27.901782] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 145976 has claimed it. 00:07:49.104 [2024-11-17 04:24:27.901825] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:49.673 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (145985) - No such process 00:07:49.673 ERROR: process (pid: 145985) is no longer running 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 145976 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 145976 00:07:49.673 04:24:28 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:50.242 lslocks: write error 00:07:50.242 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 145976 00:07:50.242 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145976 ']' 00:07:50.242 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 145976 00:07:50.242 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:50.242 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:50.242 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145976 00:07:50.502 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:50.502 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:50.502 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145976' 00:07:50.502 killing process with pid 145976 00:07:50.502 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 145976 00:07:50.502 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 145976 00:07:50.763 00:07:50.763 real 0m1.919s 00:07:50.763 user 0m2.034s 00:07:50.763 sys 0m0.746s 00:07:50.763 04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:50.763 
04:24:29 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:50.763 ************************************ 00:07:50.763 END TEST locking_app_on_locked_coremask 00:07:50.763 ************************************ 00:07:50.763 04:24:29 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:50.763 04:24:29 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:50.763 04:24:29 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.763 04:24:29 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:50.763 ************************************ 00:07:50.763 START TEST locking_overlapped_coremask 00:07:50.763 ************************************ 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=146287 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 146287 /var/tmp/spdk.sock 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 146287 ']' 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:50.763 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:50.763 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:50.763 [2024-11-17 04:24:29.476138] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:50.763 [2024-11-17 04:24:29.476201] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146287 ] 00:07:50.763 [2024-11-17 04:24:29.561560] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:50.763 [2024-11-17 04:24:29.584519] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:50.763 [2024-11-17 04:24:29.584627] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.763 [2024-11-17 04:24:29.584628] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=146377 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 146377 /var/tmp/spdk2.sock 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 146377 /var/tmp/spdk2.sock 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 146377 /var/tmp/spdk2.sock 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 146377 ']' 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:51.022 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:51.022 04:24:29 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:51.022 [2024-11-17 04:24:29.802126] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:51.022 [2024-11-17 04:24:29.802177] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146377 ] 00:07:51.281 [2024-11-17 04:24:29.899958] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 146287 has claimed it. 00:07:51.281 [2024-11-17 04:24:29.899998] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:51.852 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (146377) - No such process 00:07:51.852 ERROR: process (pid: 146377) is no longer running 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 146287 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 146287 ']' 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 146287 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146287 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146287' 00:07:51.852 killing process with pid 146287 00:07:51.852 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 146287 00:07:51.852 04:24:30 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 146287 00:07:52.113 00:07:52.113 real 0m1.392s 00:07:52.113 user 0m3.844s 00:07:52.113 sys 0m0.449s 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:52.113 ************************************ 00:07:52.113 END TEST locking_overlapped_coremask 00:07:52.113 ************************************ 00:07:52.113 04:24:30 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:52.113 04:24:30 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:52.113 04:24:30 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:52.113 04:24:30 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:52.113 ************************************ 00:07:52.113 START TEST locking_overlapped_coremask_via_rpc 00:07:52.113 ************************************ 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=146582 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 146582 /var/tmp/spdk.sock 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146582 ']' 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:52.113 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:52.113 04:24:30 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.374 [2024-11-17 04:24:30.949042] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:52.374 [2024-11-17 04:24:30.949117] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146582 ] 00:07:52.374 [2024-11-17 04:24:31.036341] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:52.374 [2024-11-17 04:24:31.036367] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:52.374 [2024-11-17 04:24:31.061723] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:52.374 [2024-11-17 04:24:31.061847] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.374 [2024-11-17 04:24:31.061848] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:52.634 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:52.634 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:52.634 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=146638 00:07:52.634 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 146638 /var/tmp/spdk2.sock 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146638 ']' 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:52.635 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:52.635 04:24:31 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.635 [2024-11-17 04:24:31.284250] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:52.635 [2024-11-17 04:24:31.284339] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146638 ] 00:07:52.635 [2024-11-17 04:24:31.385521] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:52.635 [2024-11-17 04:24:31.385551] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:52.635 [2024-11-17 04:24:31.434634] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:52.635 [2024-11-17 04:24:31.437742] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:52.635 [2024-11-17 04:24:31.437744] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.575 [2024-11-17 04:24:32.171755] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 146582 has claimed it. 
00:07:53.575 request: 00:07:53.575 { 00:07:53.575 "method": "framework_enable_cpumask_locks", 00:07:53.575 "req_id": 1 00:07:53.575 } 00:07:53.575 Got JSON-RPC error response 00:07:53.575 response: 00:07:53.575 { 00:07:53.575 "code": -32603, 00:07:53.575 "message": "Failed to claim CPU core: 2" 00:07:53.575 } 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 146582 /var/tmp/spdk.sock 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146582 ']' 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:53.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 146638 /var/tmp/spdk2.sock 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146638 ']' 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:53.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:53.575 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.835 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:53.835 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:53.835 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:53.835 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:53.835 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:53.836 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:53.836 00:07:53.836 real 0m1.666s 00:07:53.836 user 0m0.816s 00:07:53.836 sys 0m0.155s 00:07:53.836 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:53.836 04:24:32 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.836 ************************************ 00:07:53.836 END TEST locking_overlapped_coremask_via_rpc 00:07:53.836 ************************************ 00:07:53.836 04:24:32 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:53.836 04:24:32 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 146582 ]] 00:07:53.836 04:24:32 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 146582 00:07:53.836 04:24:32 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146582 ']' 00:07:53.836 04:24:32 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146582 00:07:53.836 04:24:32 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:53.836 04:24:32 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:53.836 04:24:32 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146582 00:07:54.096 04:24:32 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:54.096 04:24:32 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:54.096 04:24:32 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146582' 00:07:54.096 killing process with pid 146582 00:07:54.096 04:24:32 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 146582 00:07:54.096 04:24:32 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 146582 00:07:54.357 04:24:32 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 146638 ]] 00:07:54.357 04:24:32 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 146638 00:07:54.357 04:24:32 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146638 ']' 00:07:54.357 04:24:32 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146638 00:07:54.357 04:24:32 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:54.357 04:24:32 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
00:07:54.357 04:24:32 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146638 00:07:54.357 04:24:33 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:54.357 04:24:33 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:54.357 04:24:33 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146638' 00:07:54.357 killing process with pid 146638 00:07:54.357 04:24:33 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 146638 00:07:54.357 04:24:33 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 146638 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 146582 ]] 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 146582 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146582 ']' 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146582 00:07:54.618 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (146582) - No such process 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 146582 is not found' 00:07:54.618 Process with pid 146582 is not found 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 146638 ]] 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 146638 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146638 ']' 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146638 00:07:54.618 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (146638) - No such process 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 146638 is not found' 00:07:54.618 Process with pid 146638 is not found 00:07:54.618 04:24:33 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:54.618 00:07:54.618 real 0m14.535s 00:07:54.618 user 0m24.560s 00:07:54.618 sys 0m5.995s 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:54.618 04:24:33 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:54.618 ************************************ 00:07:54.618 END TEST cpu_locks 00:07:54.618 ************************************ 00:07:54.618 00:07:54.618 real 0m39.716s 00:07:54.618 user 1m14.626s 00:07:54.618 sys 0m10.456s 00:07:54.618 04:24:33 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:54.618 04:24:33 event -- common/autotest_common.sh@10 -- # set +x 00:07:54.618 ************************************ 00:07:54.618 END TEST event 00:07:54.618 ************************************ 00:07:54.618 04:24:33 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:54.618 04:24:33 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:54.618 04:24:33 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:54.618 04:24:33 -- common/autotest_common.sh@10 -- # set +x 00:07:54.878 ************************************ 00:07:54.878 START TEST thread 00:07:54.878 ************************************ 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:54.878 * Looking for test storage... 00:07:54.878 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:54.878 04:24:33 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:54.878 04:24:33 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:54.878 04:24:33 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:54.878 04:24:33 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:54.878 04:24:33 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:54.878 04:24:33 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:54.878 04:24:33 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:54.878 04:24:33 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:54.878 04:24:33 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:54.878 04:24:33 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:54.878 04:24:33 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:54.878 04:24:33 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:54.878 04:24:33 thread -- scripts/common.sh@345 -- # : 1 00:07:54.878 04:24:33 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:54.878 04:24:33 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:54.878 04:24:33 thread -- scripts/common.sh@365 -- # decimal 1 00:07:54.878 04:24:33 thread -- scripts/common.sh@353 -- # local d=1 00:07:54.878 04:24:33 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:54.878 04:24:33 thread -- scripts/common.sh@355 -- # echo 1 00:07:54.878 04:24:33 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:54.878 04:24:33 thread -- scripts/common.sh@366 -- # decimal 2 00:07:54.878 04:24:33 thread -- scripts/common.sh@353 -- # local d=2 00:07:54.878 04:24:33 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:54.878 04:24:33 thread -- scripts/common.sh@355 -- # echo 2 00:07:54.878 04:24:33 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:54.878 04:24:33 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:54.878 04:24:33 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:54.878 04:24:33 thread -- scripts/common.sh@368 -- # return 0 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:54.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.878 --rc genhtml_branch_coverage=1 00:07:54.878 --rc genhtml_function_coverage=1 00:07:54.878 --rc genhtml_legend=1 00:07:54.878 --rc geninfo_all_blocks=1 00:07:54.878 --rc geninfo_unexecuted_blocks=1 00:07:54.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.878 ' 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:54.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.878 --rc genhtml_branch_coverage=1 00:07:54.878 --rc genhtml_function_coverage=1 00:07:54.878 --rc genhtml_legend=1 00:07:54.878 --rc geninfo_all_blocks=1 
00:07:54.878 --rc geninfo_unexecuted_blocks=1 00:07:54.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.878 ' 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:54.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.878 --rc genhtml_branch_coverage=1 00:07:54.878 --rc genhtml_function_coverage=1 00:07:54.878 --rc genhtml_legend=1 00:07:54.878 --rc geninfo_all_blocks=1 00:07:54.878 --rc geninfo_unexecuted_blocks=1 00:07:54.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.878 ' 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:54.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.878 --rc genhtml_branch_coverage=1 00:07:54.878 --rc genhtml_function_coverage=1 00:07:54.878 --rc genhtml_legend=1 00:07:54.878 --rc geninfo_all_blocks=1 00:07:54.878 --rc geninfo_unexecuted_blocks=1 00:07:54.878 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.878 ' 00:07:54.878 04:24:33 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:54.878 04:24:33 thread -- common/autotest_common.sh@10 -- # set +x 00:07:55.138 ************************************ 00:07:55.138 START TEST thread_poller_perf 00:07:55.139 ************************************ 00:07:55.139 04:24:33 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:55.139 [2024-11-17 04:24:33.734177] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:55.139 [2024-11-17 04:24:33.734258] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147226 ] 00:07:55.139 [2024-11-17 04:24:33.819894] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.139 [2024-11-17 04:24:33.842077] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.139 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:56.079 [2024-11-17T03:24:34.910Z] ====================================== 00:07:56.079 [2024-11-17T03:24:34.910Z] busy:2503819648 (cyc) 00:07:56.079 [2024-11-17T03:24:34.910Z] total_run_count: 850000 00:07:56.079 [2024-11-17T03:24:34.910Z] tsc_hz: 2500000000 (cyc) 00:07:56.079 [2024-11-17T03:24:34.910Z] ====================================== 00:07:56.079 [2024-11-17T03:24:34.910Z] poller_cost: 2945 (cyc), 1178 (nsec) 00:07:56.079 00:07:56.079 real 0m1.158s 00:07:56.079 user 0m1.057s 00:07:56.079 sys 0m0.096s 00:07:56.079 04:24:34 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:56.079 04:24:34 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:56.079 ************************************ 00:07:56.079 END TEST thread_poller_perf 00:07:56.079 ************************************ 00:07:56.340 04:24:34 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:56.340 04:24:34 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:56.340 04:24:34 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:56.340 04:24:34 thread -- common/autotest_common.sh@10 -- # set +x 00:07:56.340 ************************************ 00:07:56.340 START TEST thread_poller_perf 00:07:56.340 ************************************ 00:07:56.340 04:24:34 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:56.340 [2024-11-17 04:24:34.978489] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:56.340 [2024-11-17 04:24:34.978573] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147508 ] 00:07:56.340 [2024-11-17 04:24:35.063135] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.340 [2024-11-17 04:24:35.084665] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.340 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:57.723 [2024-11-17T03:24:36.554Z] ====================================== 00:07:57.723 [2024-11-17T03:24:36.554Z] busy:2501388338 (cyc) 00:07:57.723 [2024-11-17T03:24:36.554Z] total_run_count: 13187000 00:07:57.723 [2024-11-17T03:24:36.554Z] tsc_hz: 2500000000 (cyc) 00:07:57.723 [2024-11-17T03:24:36.554Z] ====================================== 00:07:57.723 [2024-11-17T03:24:36.554Z] poller_cost: 189 (cyc), 75 (nsec) 00:07:57.723 00:07:57.723 real 0m1.154s 00:07:57.723 user 0m1.061s 00:07:57.723 sys 0m0.088s 00:07:57.723 04:24:36 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:57.723 04:24:36 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:57.723 ************************************ 00:07:57.723 END TEST thread_poller_perf 00:07:57.723 ************************************ 00:07:57.723 04:24:36 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:57.723 04:24:36 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:57.723 04:24:36 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:57.723 04:24:36 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:57.723 04:24:36 thread -- common/autotest_common.sh@10 -- # set +x 00:07:57.723 ************************************ 00:07:57.723 START TEST thread_spdk_lock 00:07:57.723 ************************************ 00:07:57.723 04:24:36 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:57.723 [2024-11-17 04:24:36.222071] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:07:57.723 [2024-11-17 04:24:36.222157] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147718 ] 00:07:57.723 [2024-11-17 04:24:36.310948] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:57.723 [2024-11-17 04:24:36.338935] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.723 [2024-11-17 04:24:36.338935] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:58.294 [2024-11-17 04:24:36.833397] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:58.294 [2024-11-17 04:24:36.833433] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:58.294 [2024-11-17 04:24:36.833444] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x131e040 00:07:58.294 [2024-11-17 04:24:36.834141] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:58.294 [2024-11-17 04:24:36.834247] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:58.294 [2024-11-17 
04:24:36.834266] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:58.294 Starting test contend 00:07:58.294 Worker Delay Wait us Hold us Total us 00:07:58.294 0 3 166642 187960 354602 00:07:58.294 1 5 81105 289576 370681 00:07:58.294 PASS test contend 00:07:58.294 Starting test hold_by_poller 00:07:58.294 PASS test hold_by_poller 00:07:58.294 Starting test hold_by_message 00:07:58.294 PASS test hold_by_message 00:07:58.294 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:58.294 100014 assertions passed 00:07:58.294 0 assertions failed 00:07:58.294 00:07:58.294 real 0m0.658s 00:07:58.294 user 0m1.055s 00:07:58.294 sys 0m0.095s 00:07:58.294 04:24:36 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:58.294 04:24:36 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:58.294 ************************************ 00:07:58.294 END TEST thread_spdk_lock 00:07:58.294 ************************************ 00:07:58.294 00:07:58.294 real 0m3.428s 00:07:58.294 user 0m3.369s 00:07:58.294 sys 0m0.579s 00:07:58.294 04:24:36 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:58.294 04:24:36 thread -- common/autotest_common.sh@10 -- # set +x 00:07:58.294 ************************************ 00:07:58.294 END TEST thread 00:07:58.294 ************************************ 00:07:58.294 04:24:36 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:58.294 04:24:36 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:58.294 04:24:36 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:58.294 04:24:36 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:58.294 04:24:36 -- common/autotest_common.sh@10 -- # set +x 00:07:58.294 ************************************ 00:07:58.294 START TEST app_cmdline 00:07:58.294 ************************************ 00:07:58.294 04:24:36 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:58.294 * Looking for test storage... 
00:07:58.294 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:58.294 04:24:37 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:58.294 04:24:37 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:07:58.294 04:24:37 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:58.555 04:24:37 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:58.555 04:24:37 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:58.555 04:24:37 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:58.555 04:24:37 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:58.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:58.555 --rc genhtml_branch_coverage=1 00:07:58.555 --rc genhtml_function_coverage=1 00:07:58.555 --rc genhtml_legend=1 00:07:58.555 --rc geninfo_all_blocks=1 00:07:58.555 --rc geninfo_unexecuted_blocks=1 00:07:58.555 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:58.555 ' 00:07:58.555 04:24:37 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:58.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:58.555 --rc genhtml_branch_coverage=1 00:07:58.555 --rc genhtml_function_coverage=1 00:07:58.555 --rc 
genhtml_legend=1 00:07:58.555 --rc geninfo_all_blocks=1 00:07:58.555 --rc geninfo_unexecuted_blocks=1 00:07:58.555 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:58.555 ' 00:07:58.555 04:24:37 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:58.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:58.555 --rc genhtml_branch_coverage=1 00:07:58.555 --rc genhtml_function_coverage=1 00:07:58.555 --rc genhtml_legend=1 00:07:58.555 --rc geninfo_all_blocks=1 00:07:58.555 --rc geninfo_unexecuted_blocks=1 00:07:58.555 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:58.555 ' 00:07:58.555 04:24:37 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:58.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:58.555 --rc genhtml_branch_coverage=1 00:07:58.555 --rc genhtml_function_coverage=1 00:07:58.555 --rc genhtml_legend=1 00:07:58.555 --rc geninfo_all_blocks=1 00:07:58.555 --rc geninfo_unexecuted_blocks=1 00:07:58.555 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:58.555 ' 00:07:58.555 04:24:37 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:58.555 04:24:37 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=147873 00:07:58.555 04:24:37 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 147873 00:07:58.555 04:24:37 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:58.556 04:24:37 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 147873 ']' 00:07:58.556 04:24:37 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:58.556 04:24:37 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:58.556 04:24:37 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:58.556 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:58.556 04:24:37 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:58.556 04:24:37 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:58.556 [2024-11-17 04:24:37.215240] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:07:58.556 [2024-11-17 04:24:37.215332] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147873 ] 00:07:58.556 [2024-11-17 04:24:37.299997] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.556 [2024-11-17 04:24:37.321389] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.816 04:24:37 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:58.816 04:24:37 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:07:58.816 04:24:37 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:59.076 { 00:07:59.076 "version": "SPDK v25.01-pre git sha1 83e8405e4", 00:07:59.076 "fields": { 00:07:59.076 "major": 25, 00:07:59.076 "minor": 1, 00:07:59.076 "patch": 0, 00:07:59.076 "suffix": "-pre", 00:07:59.076 "commit": "83e8405e4" 00:07:59.076 } 00:07:59.076 } 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:59.076 04:24:37 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:59.076 04:24:37 app_cmdline -- 
common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:59.076 04:24:37 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:59.336 request: 00:07:59.336 { 00:07:59.336 "method": "env_dpdk_get_mem_stats", 00:07:59.336 "req_id": 1 00:07:59.336 } 00:07:59.336 Got JSON-RPC error response 00:07:59.336 response: 00:07:59.336 { 00:07:59.336 "code": -32601, 00:07:59.336 "message": "Method not found" 00:07:59.336 } 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:59.336 04:24:37 app_cmdline -- app/cmdline.sh@1 -- # killprocess 147873 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 147873 ']' 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 147873 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:59.336 04:24:37 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 147873 00:07:59.336 04:24:38 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:59.336 04:24:38 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:59.336 04:24:38 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 147873' 00:07:59.336 killing process with pid 147873 00:07:59.336 04:24:38 app_cmdline -- common/autotest_common.sh@973 -- # kill 147873 00:07:59.336 04:24:38 app_cmdline -- common/autotest_common.sh@978 -- # wait 147873 00:07:59.597 00:07:59.597 real 0m1.333s 00:07:59.597 user 0m1.533s 00:07:59.597 sys 0m0.497s 00:07:59.597 04:24:38 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:59.597 04:24:38 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:59.597 ************************************ 00:07:59.597 END TEST app_cmdline 00:07:59.597 ************************************ 00:07:59.597 04:24:38 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:59.597 04:24:38 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:59.597 04:24:38 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:59.597 04:24:38 -- common/autotest_common.sh@10 -- # set +x 00:07:59.597 ************************************ 00:07:59.597 START TEST version 00:07:59.597 ************************************ 00:07:59.597 04:24:38 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:59.858 * Looking for test storage... 
00:07:59.858 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1693 -- # lcov --version 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:59.858 04:24:38 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:59.858 04:24:38 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:59.858 04:24:38 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:59.858 04:24:38 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:59.858 04:24:38 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:59.858 04:24:38 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:59.858 04:24:38 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:59.858 04:24:38 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:59.858 04:24:38 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:59.858 04:24:38 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:59.858 04:24:38 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:59.858 04:24:38 version -- scripts/common.sh@344 -- # case "$op" in 00:07:59.858 04:24:38 version -- scripts/common.sh@345 -- # : 1 00:07:59.858 04:24:38 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:59.858 04:24:38 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:59.858 04:24:38 version -- scripts/common.sh@365 -- # decimal 1 00:07:59.858 04:24:38 version -- scripts/common.sh@353 -- # local d=1 00:07:59.858 04:24:38 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:59.858 04:24:38 version -- scripts/common.sh@355 -- # echo 1 00:07:59.858 04:24:38 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:59.858 04:24:38 version -- scripts/common.sh@366 -- # decimal 2 00:07:59.858 04:24:38 version -- scripts/common.sh@353 -- # local d=2 00:07:59.858 04:24:38 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:59.858 04:24:38 version -- scripts/common.sh@355 -- # echo 2 00:07:59.858 04:24:38 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:59.858 04:24:38 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:59.858 04:24:38 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:59.858 04:24:38 version -- scripts/common.sh@368 -- # return 0 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:59.858 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.858 --rc genhtml_branch_coverage=1 00:07:59.858 --rc genhtml_function_coverage=1 00:07:59.858 --rc genhtml_legend=1 00:07:59.858 --rc geninfo_all_blocks=1 00:07:59.858 --rc geninfo_unexecuted_blocks=1 00:07:59.858 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.858 ' 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:59.858 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.858 --rc genhtml_branch_coverage=1 00:07:59.858 --rc genhtml_function_coverage=1 00:07:59.858 --rc genhtml_legend=1 00:07:59.858 --rc geninfo_all_blocks=1 00:07:59.858 --rc geninfo_unexecuted_blocks=1 00:07:59.858 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.858 ' 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:59.858 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.858 --rc genhtml_branch_coverage=1 00:07:59.858 --rc genhtml_function_coverage=1 00:07:59.858 --rc genhtml_legend=1 00:07:59.858 --rc geninfo_all_blocks=1 00:07:59.858 --rc geninfo_unexecuted_blocks=1 00:07:59.858 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.858 ' 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:59.858 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.858 --rc genhtml_branch_coverage=1 00:07:59.858 --rc genhtml_function_coverage=1 00:07:59.858 --rc genhtml_legend=1 00:07:59.858 --rc geninfo_all_blocks=1 00:07:59.858 --rc geninfo_unexecuted_blocks=1 00:07:59.858 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.858 ' 00:07:59.858 04:24:38 version -- app/version.sh@17 -- # get_header_version major 00:07:59.858 04:24:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # cut -f2 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.858 04:24:38 version -- app/version.sh@17 -- # major=25 00:07:59.858 04:24:38 version -- app/version.sh@18 -- # get_header_version minor 00:07:59.858 04:24:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # cut -f2 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.858 04:24:38 version -- app/version.sh@18 -- # minor=1 00:07:59.858 04:24:38 version -- app/version.sh@19 -- # get_header_version patch 00:07:59.858 04:24:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # cut -f2 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.858 04:24:38 version -- app/version.sh@19 -- # patch=0 00:07:59.858 04:24:38 version -- app/version.sh@20 -- # get_header_version suffix 00:07:59.858 04:24:38 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # cut -f2 00:07:59.858 04:24:38 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.858 04:24:38 version -- app/version.sh@20 -- # suffix=-pre 00:07:59.858 04:24:38 version -- app/version.sh@22 -- # version=25.1 00:07:59.858 04:24:38 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:59.858 04:24:38 version -- app/version.sh@28 -- # version=25.1rc0 00:07:59.858 04:24:38 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:59.858 04:24:38 version -- app/version.sh@30 -- # 
python3 -c 'import spdk; print(spdk.__version__)' 00:07:59.858 04:24:38 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:59.858 04:24:38 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:59.858 00:07:59.858 real 0m0.275s 00:07:59.858 user 0m0.166s 00:07:59.858 sys 0m0.164s 00:07:59.858 04:24:38 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:59.858 04:24:38 version -- common/autotest_common.sh@10 -- # set +x 00:07:59.858 ************************************ 00:07:59.858 END TEST version 00:07:59.858 ************************************ 00:08:00.119 04:24:38 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@194 -- # uname -s 00:08:00.119 04:24:38 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@260 -- # timing_exit lib 00:08:00.119 04:24:38 -- common/autotest_common.sh@732 -- # xtrace_disable 00:08:00.119 04:24:38 -- common/autotest_common.sh@10 -- # set +x 00:08:00.119 04:24:38 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:08:00.119 04:24:38 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:08:00.119 04:24:38 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:00.119 04:24:38 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:00.119 04:24:38 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:00.119 04:24:38 -- common/autotest_common.sh@10 -- # set +x 00:08:00.119 ************************************ 00:08:00.119 START TEST llvm_fuzz 00:08:00.119 ************************************ 00:08:00.119 04:24:38 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:00.119 * Looking for test storage... 
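The version test traced above derives 25.1rc0 by grepping SPDK_VERSION_MAJOR/MINOR/PATCH/SUFFIX out of include/spdk/version.h and cross-checking the result against python3 -c 'import spdk; print(spdk.__version__)'. A condensed sketch of that header parse, reusing the same grep/cut/tr pipeline seen in the trace, might look as follows; the workspace path and the helper name are illustrative, and the -pre-to-rc0 mapping is assumed from what the trace reports rather than taken from the full version.sh logic.

#!/usr/bin/env bash
# Sketch of the header parse seen in app/version.sh above (assumptions noted).
spdk_root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # illustrative path
hdr="$spdk_root/include/spdk/version.h"

get_header_field() {
    # Same pipeline as the trace: grep the #define, take field 2, strip quotes.
    # (Helper name is illustrative, not the script's own.)
    grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" "$hdr" | cut -f2 | tr -d '"'
}

major=$(get_header_field MAJOR)     # 25 in this run
minor=$(get_header_field MINOR)     # 1
patch=$(get_header_field PATCH)     # 0
suffix=$(get_header_field SUFFIX)   # -pre

version="$major.$minor"
(( patch != 0 )) && version="$version.$patch"
# The trace reports -pre as rc0 (25.1 -> 25.1rc0); assumed mapping for this sketch.
[[ $suffix == -pre ]] && version="${version}rc0"

echo "$version"
# The test exports PYTHONPATH to the in-tree python/ dir before this check:
python3 -c 'import spdk; print(spdk.__version__)'   # should agree, e.g. 25.1rc0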
00:08:00.120 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:08:00.120 04:24:38 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:00.120 04:24:38 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:00.120 04:24:38 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:00.380 04:24:39 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:00.380 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.380 --rc genhtml_branch_coverage=1 00:08:00.380 --rc genhtml_function_coverage=1 00:08:00.380 --rc genhtml_legend=1 00:08:00.380 --rc geninfo_all_blocks=1 00:08:00.380 --rc geninfo_unexecuted_blocks=1 00:08:00.380 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.380 ' 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:00.380 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.380 --rc genhtml_branch_coverage=1 00:08:00.380 --rc genhtml_function_coverage=1 00:08:00.380 --rc genhtml_legend=1 00:08:00.380 --rc geninfo_all_blocks=1 00:08:00.380 --rc 
geninfo_unexecuted_blocks=1 00:08:00.380 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.380 ' 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:00.380 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.380 --rc genhtml_branch_coverage=1 00:08:00.380 --rc genhtml_function_coverage=1 00:08:00.380 --rc genhtml_legend=1 00:08:00.380 --rc geninfo_all_blocks=1 00:08:00.380 --rc geninfo_unexecuted_blocks=1 00:08:00.380 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.380 ' 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:00.380 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.380 --rc genhtml_branch_coverage=1 00:08:00.380 --rc genhtml_function_coverage=1 00:08:00.380 --rc genhtml_legend=1 00:08:00.380 --rc geninfo_all_blocks=1 00:08:00.380 --rc geninfo_unexecuted_blocks=1 00:08:00.380 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.380 ' 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:00.380 04:24:39 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:00.380 04:24:39 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:00.380 ************************************ 00:08:00.380 START TEST nvmf_llvm_fuzz 00:08:00.380 ************************************ 00:08:00.380 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:00.380 * Looking for test storage... 
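The llvm_fuzz runner traced above builds its target list from the entries under test/fuzz/llvm/ ('common.sh llvm-gcov.sh nvmf vfio') and then case-matches each one, so only the nvmf and vfio directories get a run.sh invocation. A minimal sketch of that discovery loop, assuming helper scripts are skipped exactly as the trace shows, is below; the rootdir path is illustrative.

#!/usr/bin/env bash
# Sketch of the fuzzer-target discovery shown in fuzz/llvm.sh@11-19 above.
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # illustrative path

fuzzers=("$rootdir/test/fuzz/llvm/"*)     # common.sh llvm-gcov.sh nvmf vfio in this run
fuzzers=("${fuzzers[@]##*/}")             # keep only the basenames

for fuzzer in "${fuzzers[@]}"; do
    case "$fuzzer" in
        nvmf|vfio)
            echo "would run: $rootdir/test/fuzz/llvm/$fuzzer/run.sh"
            ;;
        *)
            # helpers like common.sh and llvm-gcov.sh fall through untouched
            ;;
    esac
done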
00:08:00.380 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.380 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:00.380 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:00.380 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:00.644 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:00.645 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.645 --rc genhtml_branch_coverage=1 00:08:00.645 --rc genhtml_function_coverage=1 00:08:00.645 --rc genhtml_legend=1 00:08:00.645 --rc geninfo_all_blocks=1 00:08:00.645 --rc geninfo_unexecuted_blocks=1 00:08:00.645 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.645 ' 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:00.645 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.645 --rc genhtml_branch_coverage=1 00:08:00.645 --rc genhtml_function_coverage=1 00:08:00.645 --rc genhtml_legend=1 00:08:00.645 --rc geninfo_all_blocks=1 00:08:00.645 --rc geninfo_unexecuted_blocks=1 00:08:00.645 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.645 ' 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:00.645 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.645 --rc genhtml_branch_coverage=1 00:08:00.645 --rc genhtml_function_coverage=1 00:08:00.645 --rc genhtml_legend=1 00:08:00.645 --rc geninfo_all_blocks=1 00:08:00.645 --rc geninfo_unexecuted_blocks=1 00:08:00.645 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.645 ' 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:00.645 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.645 --rc genhtml_branch_coverage=1 00:08:00.645 --rc genhtml_function_coverage=1 00:08:00.645 --rc genhtml_legend=1 00:08:00.645 --rc geninfo_all_blocks=1 00:08:00.645 --rc geninfo_unexecuted_blocks=1 00:08:00.645 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.645 ' 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:00.645 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:00.646 04:24:39 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:00.646 04:24:39 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:00.646 #define SPDK_CONFIG_H 00:08:00.646 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:00.646 #define SPDK_CONFIG_APPS 1 00:08:00.646 #define SPDK_CONFIG_ARCH native 00:08:00.646 #undef SPDK_CONFIG_ASAN 00:08:00.646 #undef SPDK_CONFIG_AVAHI 00:08:00.646 #undef SPDK_CONFIG_CET 00:08:00.646 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:00.646 #define SPDK_CONFIG_COVERAGE 1 00:08:00.646 #define SPDK_CONFIG_CROSS_PREFIX 00:08:00.646 #undef SPDK_CONFIG_CRYPTO 00:08:00.646 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:00.646 #undef SPDK_CONFIG_CUSTOMOCF 00:08:00.646 #undef SPDK_CONFIG_DAOS 00:08:00.646 #define SPDK_CONFIG_DAOS_DIR 00:08:00.646 #define SPDK_CONFIG_DEBUG 1 00:08:00.646 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:00.646 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:00.646 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:00.646 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.646 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:00.646 #undef SPDK_CONFIG_DPDK_UADK 00:08:00.646 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:00.646 #define SPDK_CONFIG_EXAMPLES 1 00:08:00.646 #undef SPDK_CONFIG_FC 00:08:00.646 #define SPDK_CONFIG_FC_PATH 00:08:00.646 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:00.646 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:00.646 #define SPDK_CONFIG_FSDEV 1 00:08:00.646 #undef 
SPDK_CONFIG_FUSE 00:08:00.646 #define SPDK_CONFIG_FUZZER 1 00:08:00.646 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:00.646 #undef SPDK_CONFIG_GOLANG 00:08:00.646 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:00.646 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:00.646 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:00.646 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:00.646 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:00.646 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:00.646 #undef SPDK_CONFIG_HAVE_LZ4 00:08:00.646 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:00.646 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:00.646 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:00.646 #define SPDK_CONFIG_IDXD 1 00:08:00.646 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:00.646 #undef SPDK_CONFIG_IPSEC_MB 00:08:00.646 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:00.646 #define SPDK_CONFIG_ISAL 1 00:08:00.646 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:00.646 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:00.646 #define SPDK_CONFIG_LIBDIR 00:08:00.646 #undef SPDK_CONFIG_LTO 00:08:00.646 #define SPDK_CONFIG_MAX_LCORES 128 00:08:00.646 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:00.646 #define SPDK_CONFIG_NVME_CUSE 1 00:08:00.646 #undef SPDK_CONFIG_OCF 00:08:00.646 #define SPDK_CONFIG_OCF_PATH 00:08:00.646 #define SPDK_CONFIG_OPENSSL_PATH 00:08:00.646 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:00.646 #define SPDK_CONFIG_PGO_DIR 00:08:00.646 #undef SPDK_CONFIG_PGO_USE 00:08:00.646 #define SPDK_CONFIG_PREFIX /usr/local 00:08:00.646 #undef SPDK_CONFIG_RAID5F 00:08:00.646 #undef SPDK_CONFIG_RBD 00:08:00.646 #define SPDK_CONFIG_RDMA 1 00:08:00.646 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:00.646 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:00.646 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:00.646 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:00.646 #undef SPDK_CONFIG_SHARED 00:08:00.646 #undef SPDK_CONFIG_SMA 00:08:00.646 #define SPDK_CONFIG_TESTS 1 00:08:00.646 #undef SPDK_CONFIG_TSAN 00:08:00.646 #define SPDK_CONFIG_UBLK 1 00:08:00.646 #define SPDK_CONFIG_UBSAN 1 00:08:00.646 #undef SPDK_CONFIG_UNIT_TESTS 00:08:00.646 #undef SPDK_CONFIG_URING 00:08:00.646 #define SPDK_CONFIG_URING_PATH 00:08:00.646 #undef SPDK_CONFIG_URING_ZNS 00:08:00.646 #undef SPDK_CONFIG_USDT 00:08:00.646 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:00.646 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:00.646 #define SPDK_CONFIG_VFIO_USER 1 00:08:00.646 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:00.646 #define SPDK_CONFIG_VHOST 1 00:08:00.646 #define SPDK_CONFIG_VIRTIO 1 00:08:00.646 #undef SPDK_CONFIG_VTUNE 00:08:00.646 #define SPDK_CONFIG_VTUNE_DIR 00:08:00.646 #define SPDK_CONFIG_WERROR 1 00:08:00.646 #define SPDK_CONFIG_WPDK_DIR 00:08:00.646 #undef SPDK_CONFIG_XNVME 00:08:00.646 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.646 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:00.647 04:24:39 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:00.647 
04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:08:00.647 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:00.648 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 148412 ]] 00:08:00.649 
04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 148412 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.O85irM 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.O85irM/tests/nvmf /tmp/spdk.O85irM 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=53106348032 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730594816 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=8624246784 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861869056 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865297408 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340125696 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346122240 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5996544 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30865121280 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865297408 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=176128 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:00.649 * Looking for test storage... 
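(Condensed for reference: the sizing check that set_test_storage performs in the lines that follow works out as below, using the df figures captured just above for the spdk_root overlay mounted on /. This is only a restatement of the traced arithmetic, not additional tooling from the run.)

    requested_size=2214592512              # 2 GiB requested + 64 MiB headroom
    target_space=53106348032               # bytes available on /
    uses=8624246784                        # bytes already used on /
    sizes=61730594816                      # total size of /
    (( target_space >= requested_size ))   # enough free space on this candidate
    new_size=$(( requested_size + uses ))  # 10838839296
    (( new_size * 100 / sizes > 95 )) || echo "/ accepted as test storage"   # ~17% < 95%, so / is used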
00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=53106348032 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=10838839296 00:08:00.649 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.650 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:00.650 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:00.910 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.910 --rc genhtml_branch_coverage=1 00:08:00.910 --rc genhtml_function_coverage=1 00:08:00.910 --rc genhtml_legend=1 00:08:00.910 --rc geninfo_all_blocks=1 00:08:00.910 --rc geninfo_unexecuted_blocks=1 00:08:00.910 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.910 ' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:00.910 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.910 --rc genhtml_branch_coverage=1 00:08:00.910 --rc genhtml_function_coverage=1 00:08:00.910 --rc genhtml_legend=1 00:08:00.910 --rc geninfo_all_blocks=1 00:08:00.910 --rc geninfo_unexecuted_blocks=1 00:08:00.910 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.910 ' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:00.910 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.910 --rc genhtml_branch_coverage=1 00:08:00.910 --rc genhtml_function_coverage=1 00:08:00.910 --rc genhtml_legend=1 00:08:00.910 --rc geninfo_all_blocks=1 00:08:00.910 --rc geninfo_unexecuted_blocks=1 00:08:00.910 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.910 ' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:00.910 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.910 --rc genhtml_branch_coverage=1 00:08:00.910 --rc genhtml_function_coverage=1 00:08:00.910 --rc genhtml_legend=1 00:08:00.910 --rc geninfo_all_blocks=1 00:08:00.910 --rc geninfo_unexecuted_blocks=1 00:08:00.910 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.910 ' 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:08:00.910 04:24:39 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:00.910 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.911 04:24:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:08:00.911 [2024-11-17 04:24:39.595942] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:00.911 [2024-11-17 04:24:39.596024] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148622 ] 00:08:01.172 [2024-11-17 04:24:39.818556] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.172 [2024-11-17 04:24:39.831168] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.172 [2024-11-17 04:24:39.883770] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.172 [2024-11-17 04:24:39.900124] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:08:01.172 INFO: Running with entropic power schedule (0xFF, 100). 00:08:01.172 INFO: Seed: 1501369883 00:08:01.172 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:01.172 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:01.172 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:01.172 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.172 #2 INITED exec/s: 0 rss: 66Mb 00:08:01.172 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:01.172 This may also happen if the target rejected all inputs we tried so far 00:08:01.172 [2024-11-17 04:24:39.965557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:01.172 [2024-11-17 04:24:39.965584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.692 NEW_FUNC[1/714]: 0x452788 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:08:01.692 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.692 #18 NEW cov: 12137 ft: 12137 corp: 2/123b lim: 320 exec/s: 0 rss: 73Mb L: 122/122 MS: 1 InsertRepeatedBytes- 00:08:01.692 [2024-11-17 04:24:40.326624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:01.692 [2024-11-17 04:24:40.326690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.692 NEW_FUNC[1/1]: 0x197f838 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:08:01.692 #29 NEW cov: 12320 ft: 13007 corp: 3/246b lim: 320 exec/s: 0 rss: 73Mb L: 123/123 MS: 1 CrossOver- 00:08:01.692 [2024-11-17 04:24:40.376468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:01.692 [2024-11-17 04:24:40.376498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.692 #30 NEW cov: 12326 ft: 13299 corp: 4/369b lim: 320 exec/s: 0 rss: 73Mb L: 123/123 MS: 1 CrossOver- 00:08:01.692 [2024-11-17 
04:24:40.436654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:01.692 [2024-11-17 04:24:40.436684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.692 #31 NEW cov: 12411 ft: 13652 corp: 5/491b lim: 320 exec/s: 0 rss: 73Mb L: 122/123 MS: 1 ChangeBit- 00:08:01.692 [2024-11-17 04:24:40.496845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:01.692 [2024-11-17 04:24:40.496872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 #32 NEW cov: 12411 ft: 13764 corp: 6/615b lim: 320 exec/s: 0 rss: 73Mb L: 124/124 MS: 1 InsertByte- 00:08:01.952 [2024-11-17 04:24:40.556963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:01.952 [2024-11-17 04:24:40.556991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 #33 NEW cov: 12411 ft: 13825 corp: 7/737b lim: 320 exec/s: 0 rss: 73Mb L: 122/124 MS: 1 ChangeBit- 00:08:01.952 [2024-11-17 04:24:40.617128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:01.952 [2024-11-17 04:24:40.617154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 #34 NEW cov: 12411 ft: 13934 corp: 8/859b lim: 320 exec/s: 0 rss: 73Mb L: 122/124 MS: 1 ChangeBit- 00:08:01.952 [2024-11-17 04:24:40.657235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:01.952 [2024-11-17 04:24:40.657261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 #35 NEW cov: 12411 ft: 13992 corp: 9/984b lim: 320 exec/s: 0 rss: 73Mb L: 125/125 MS: 1 CopyPart- 00:08:01.952 [2024-11-17 04:24:40.717381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:a0b0b0b0 00:08:01.952 [2024-11-17 04:24:40.717407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.952 #36 NEW cov: 12411 ft: 14092 corp: 10/1106b lim: 320 exec/s: 0 rss: 73Mb L: 122/125 MS: 1 ChangeByte- 00:08:01.952 [2024-11-17 04:24:40.777572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:01.952 [2024-11-17 04:24:40.777599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.213 #37 NEW cov: 12411 ft: 14123 corp: 11/1220b lim: 320 exec/s: 0 rss: 74Mb L: 114/125 MS: 1 EraseBytes- 00:08:02.213 [2024-11-17 04:24:40.817781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.213 [2024-11-17 04:24:40.817807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:02.213 [2024-11-17 04:24:40.817861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.213 [2024-11-17 04:24:40.817875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.213 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:02.213 #38 NEW cov: 12434 ft: 14299 corp: 12/1372b lim: 320 exec/s: 0 rss: 74Mb L: 152/152 MS: 1 CrossOver- 00:08:02.213 [2024-11-17 04:24:40.857914] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.213 [2024-11-17 04:24:40.857939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.213 [2024-11-17 04:24:40.857992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:7cb0b0b0 cdw11:b0b0b0b0 00:08:02.213 [2024-11-17 04:24:40.858007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.213 #44 NEW cov: 12434 ft: 14326 corp: 13/1544b lim: 320 exec/s: 0 rss: 74Mb L: 172/172 MS: 1 CopyPart- 00:08:02.213 [2024-11-17 04:24:40.918108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b00098 00:08:02.213 [2024-11-17 04:24:40.918133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.213 [2024-11-17 04:24:40.918186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.213 [2024-11-17 04:24:40.918200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.213 #45 NEW cov: 12434 ft: 14349 corp: 14/1696b lim: 320 exec/s: 45 rss: 74Mb L: 152/172 MS: 1 ChangeBinInt- 00:08:02.213 [2024-11-17 04:24:40.978127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.213 [2024-11-17 04:24:40.978163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.213 #46 NEW cov: 12434 ft: 14386 corp: 15/1810b lim: 320 exec/s: 46 rss: 74Mb L: 114/172 MS: 1 ChangeByte- 00:08:02.213 [2024-11-17 04:24:41.038397] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.213 [2024-11-17 04:24:41.038423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.213 [2024-11-17 04:24:41.038475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.213 [2024-11-17 04:24:41.038488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.473 #47 NEW cov: 12434 ft: 14399 corp: 16/1956b lim: 320 exec/s: 47 rss: 74Mb L: 146/172 MS: 1 CopyPart- 00:08:02.473 
[2024-11-17 04:24:41.078516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x7f7f7f7f7f7f7f7f 00:08:02.473 [2024-11-17 04:24:41.078541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.473 [2024-11-17 04:24:41.078597] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:b0b0b0b0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.473 [2024-11-17 04:24:41.078613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.473 NEW_FUNC[1/1]: 0x19803a8 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:08:02.473 #48 NEW cov: 12457 ft: 14852 corp: 17/2127b lim: 320 exec/s: 48 rss: 74Mb L: 171/172 MS: 1 InsertRepeatedBytes- 00:08:02.473 [2024-11-17 04:24:41.138503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.473 [2024-11-17 04:24:41.138527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.473 #49 NEW cov: 12457 ft: 14898 corp: 18/2219b lim: 320 exec/s: 49 rss: 74Mb L: 92/172 MS: 1 EraseBytes- 00:08:02.473 [2024-11-17 04:24:41.178769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:a0b0b0b0 00:08:02.473 [2024-11-17 04:24:41.178794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.473 [2024-11-17 04:24:41.178860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.474 [2024-11-17 04:24:41.178874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.474 #50 NEW cov: 12457 ft: 14962 corp: 19/2362b lim: 320 exec/s: 50 rss: 74Mb L: 143/172 MS: 1 CrossOver- 00:08:02.474 [2024-11-17 04:24:41.238950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.474 [2024-11-17 04:24:41.238974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.474 [2024-11-17 04:24:41.239026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.474 [2024-11-17 04:24:41.239039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.474 #51 NEW cov: 12457 ft: 14978 corp: 20/2514b lim: 320 exec/s: 51 rss: 74Mb L: 152/172 MS: 1 CMP- DE: "\377\377"- 00:08:02.474 [2024-11-17 04:24:41.278908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb4b0b0b0b0b0b0b0 00:08:02.474 [2024-11-17 04:24:41.278934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 NEW_FUNC[1/1]: 0x1549a28 in nvmf_tcp_req_set_cpl 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:08:02.734 #53 NEW cov: 12488 ft: 15022 corp: 21/2609b lim: 320 exec/s: 53 rss: 74Mb L: 95/172 MS: 2 PersAutoDict-CrossOver- DE: "\377\377"- 00:08:02.734 [2024-11-17 04:24:41.319082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.319110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 #54 NEW cov: 12488 ft: 15042 corp: 22/2732b lim: 320 exec/s: 54 rss: 74Mb L: 123/172 MS: 1 ChangeByte- 00:08:02.734 [2024-11-17 04:24:41.359215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:ffffffff cdw11:ffffffff 00:08:02.734 [2024-11-17 04:24:41.359240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 [2024-11-17 04:24:41.359295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:b0b0b0b0 cdw11:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.359309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.734 #55 NEW cov: 12488 ft: 15067 corp: 23/2891b lim: 320 exec/s: 55 rss: 74Mb L: 159/172 MS: 1 InsertRepeatedBytes- 00:08:02.734 [2024-11-17 04:24:41.399378] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.399403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 [2024-11-17 04:24:41.399455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.399468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.734 #56 NEW cov: 12488 ft: 15078 corp: 24/3031b lim: 320 exec/s: 56 rss: 74Mb L: 140/172 MS: 1 EraseBytes- 00:08:02.734 [2024-11-17 04:24:41.459466] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.459492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 #57 NEW cov: 12488 ft: 15107 corp: 25/3145b lim: 320 exec/s: 57 rss: 74Mb L: 114/172 MS: 1 EraseBytes- 00:08:02.734 [2024-11-17 04:24:41.499558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.499582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 #58 NEW cov: 12488 ft: 15159 corp: 26/3235b lim: 320 exec/s: 58 rss: 74Mb L: 90/172 MS: 1 EraseBytes- 00:08:02.734 [2024-11-17 04:24:41.539770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:a0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.539795] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.734 [2024-11-17 04:24:41.539848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2c) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:02.734 [2024-11-17 04:24:41.539865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.994 #59 NEW cov: 12488 ft: 15174 corp: 27/3386b lim: 320 exec/s: 59 rss: 74Mb L: 151/172 MS: 1 CMP- DE: "\000\212v\206,; \002"- 00:08:02.994 [2024-11-17 04:24:41.599895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.994 [2024-11-17 04:24:41.599921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.994 #60 NEW cov: 12488 ft: 15185 corp: 28/3510b lim: 320 exec/s: 60 rss: 74Mb L: 124/172 MS: 1 ChangeByte- 00:08:02.994 [2024-11-17 04:24:41.639951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:4 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:a0b0b0b0 00:08:02.994 [2024-11-17 04:24:41.639975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.994 #61 NEW cov: 12488 ft: 15198 corp: 29/3632b lim: 320 exec/s: 61 rss: 74Mb L: 122/172 MS: 1 ShuffleBytes- 00:08:02.994 [2024-11-17 04:24:41.680107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.994 [2024-11-17 04:24:41.680133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.994 #62 NEW cov: 12488 ft: 15213 corp: 30/3722b lim: 320 exec/s: 62 rss: 75Mb L: 90/172 MS: 1 ChangeByte- 00:08:02.994 [2024-11-17 04:24:41.740281] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.994 [2024-11-17 04:24:41.740305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.994 #63 NEW cov: 12488 ft: 15231 corp: 31/3846b lim: 320 exec/s: 63 rss: 75Mb L: 124/172 MS: 1 ShuffleBytes- 00:08:02.994 [2024-11-17 04:24:41.800420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:02.994 [2024-11-17 04:24:41.800445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.255 #64 NEW cov: 12488 ft: 15253 corp: 32/3936b lim: 320 exec/s: 64 rss: 75Mb L: 90/172 MS: 1 PersAutoDict- DE: "\000\212v\206,; \002"- 00:08:03.255 [2024-11-17 04:24:41.860725] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:03.255 [2024-11-17 04:24:41.860750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.255 [2024-11-17 04:24:41.860805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:7cb0b0b0 cdw11:b0b0b0b0 00:08:03.255 
[2024-11-17 04:24:41.860818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.255 #65 NEW cov: 12488 ft: 15256 corp: 33/4108b lim: 320 exec/s: 65 rss: 75Mb L: 172/172 MS: 1 PersAutoDict- DE: "\000\212v\206,; \002"- 00:08:03.255 [2024-11-17 04:24:41.900673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b0b0b0 00:08:03.255 [2024-11-17 04:24:41.900702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.255 #66 NEW cov: 12488 ft: 15265 corp: 34/4222b lim: 320 exec/s: 66 rss: 75Mb L: 114/172 MS: 1 ChangeBit- 00:08:03.255 [2024-11-17 04:24:41.941057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b0b0b0b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb0b0b0b0b0b00098 00:08:03.255 [2024-11-17 04:24:41.941082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.255 [2024-11-17 04:24:41.941138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b0) qid:0 cid:5 nsid:b0b0b0b0 cdw10:b0b0b0b0 cdw11:b0b0b0b0 00:08:03.255 [2024-11-17 04:24:41.941152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.255 [2024-11-17 04:24:41.941205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (fa) qid:0 cid:6 nsid:fafafafa cdw10:fafafafa cdw11:fafafafa SGL TRANSPORT DATA BLOCK TRANSPORT 0xfafafafafafafafa 00:08:03.255 [2024-11-17 04:24:41.941219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.255 #67 NEW cov: 12488 ft: 15400 corp: 35/4439b lim: 320 exec/s: 33 rss: 75Mb L: 217/217 MS: 1 InsertRepeatedBytes- 00:08:03.255 #67 DONE cov: 12488 ft: 15400 corp: 35/4439b lim: 320 exec/s: 33 rss: 75Mb 00:08:03.255 ###### Recommended dictionary. ###### 00:08:03.255 "\377\377" # Uses: 1 00:08:03.255 "\000\212v\206,; \002" # Uses: 2 00:08:03.255 ###### End of recommended dictionary. 
###### 00:08:03.255 Done 67 runs in 2 second(s) 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.255 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:03.515 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.516 04:24:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:08:03.516 [2024-11-17 04:24:42.127170] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:03.516 [2024-11-17 04:24:42.127244] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148929 ] 00:08:03.516 [2024-11-17 04:24:42.330242] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.516 [2024-11-17 04:24:42.343560] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.777 [2024-11-17 04:24:42.396215] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.777 [2024-11-17 04:24:42.412550] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:08:03.777 INFO: Running with entropic power schedule (0xFF, 100). 
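(Each round repeats the same recipe with only the index changed; fuzz_num=25, so rounds 0 through 24. Condensed from the two invocations traced above, a sketch of what run.sh assembles for a round N, assuming SPDK points at the workspace checkout; the redirect targets for the sed output and the two leak-suppression echoes are not visible in the trace and are assumed here.)

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    N=0                                              # round index, 0..24
    port=44$(printf %02d "$N")                       # 4400 for round 0, 4401 for round 1, ...
    mkdir -p "$SPDK"/../corpus/llvm_nvmf_"$N"
    echo leak:spdk_nvmf_qpair_disconnect  >  /var/tmp/suppress_nvmf_fuzz
    echo leak:nvmf_ctrlr_create           >> /var/tmp/suppress_nvmf_fuzz
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK"/test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_"$N".conf
    LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 \
    "$SPDK"/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -P "$SPDK"/../output/llvm/ \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c /tmp/fuzz_json_"$N".conf -t 1 \
        -D "$SPDK"/../corpus/llvm_nvmf_"$N" -Z "$N"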
00:08:03.777 INFO: Seed: 4013374652 00:08:03.777 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:03.777 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:03.777 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:03.777 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.777 #2 INITED exec/s: 0 rss: 65Mb 00:08:03.777 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.777 This may also happen if the target rejected all inputs we tried so far 00:08:03.777 [2024-11-17 04:24:42.488773] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:03.777 [2024-11-17 04:24:42.488979] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:03.777 [2024-11-17 04:24:42.489384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.777 [2024-11-17 04:24:42.489424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.777 [2024-11-17 04:24:42.489553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.777 [2024-11-17 04:24:42.489574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.037 NEW_FUNC[1/716]: 0x453088 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:08:04.037 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:04.037 #4 NEW cov: 12268 ft: 12252 corp: 2/16b lim: 30 exec/s: 0 rss: 72Mb L: 15/15 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:04.037 [2024-11-17 04:24:42.829692] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.037 [2024-11-17 04:24:42.829901] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.037 [2024-11-17 04:24:42.830073] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:04.037 [2024-11-17 04:24:42.830462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.037 [2024-11-17 04:24:42.830506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.037 [2024-11-17 04:24:42.830646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.037 [2024-11-17 04:24:42.830669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.037 [2024-11-17 04:24:42.830791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.037 [2024-11-17 04:24:42.830809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.296 #5 NEW cov: 12404 ft: 13359 corp: 3/34b lim: 30 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 CrossOver- 00:08:04.296 [2024-11-17 04:24:42.899849] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.296 [2024-11-17 04:24:42.900045] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004ab5 00:08:04.296 [2024-11-17 04:24:42.900226] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:04.296 [2024-11-17 04:24:42.900644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:42.900673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.296 [2024-11-17 04:24:42.900795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b54f024a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:42.900815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.296 [2024-11-17 04:24:42.900936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:42.900954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.296 #6 NEW cov: 12410 ft: 13650 corp: 4/52b lim: 30 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ChangeBinInt- 00:08:04.296 [2024-11-17 04:24:42.969863] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:08:04.296 [2024-11-17 04:24:42.970056] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.296 [2024-11-17 04:24:42.970468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:42.970497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.296 [2024-11-17 04:24:42.970614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:42.970630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.296 #12 NEW cov: 12495 ft: 14006 corp: 5/67b lim: 30 exec/s: 0 rss: 72Mb L: 15/18 MS: 1 ChangeBinInt- 00:08:04.296 [2024-11-17 04:24:43.020190] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.296 [2024-11-17 04:24:43.020377] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004ab5 00:08:04.296 [2024-11-17 04:24:43.020556] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:04.296 [2024-11-17 04:24:43.020950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:43.020979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.296 [2024-11-17 04:24:43.021097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b54a024f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:43.021117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.296 [2024-11-17 04:24:43.021245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.296 [2024-11-17 04:24:43.021264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.296 #13 NEW cov: 12495 ft: 14084 corp: 6/85b lim: 30 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ShuffleBytes- 00:08:04.296 [2024-11-17 04:24:43.090374] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.296 [2024-11-17 04:24:43.090593] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.297 [2024-11-17 04:24:43.090789] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (297688) > buf size (4096) 00:08:04.297 [2024-11-17 04:24:43.091168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.297 [2024-11-17 04:24:43.091196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.297 [2024-11-17 04:24:43.091313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.297 [2024-11-17 04:24:43.091333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.297 [2024-11-17 04:24:43.091452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:22b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.297 [2024-11-17 04:24:43.091468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.297 #14 NEW cov: 12495 ft: 14173 corp: 7/103b lim: 30 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ChangeByte- 00:08:04.556 [2024-11-17 04:24:43.140554] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.556 [2024-11-17 04:24:43.140807] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b54a 00:08:04.556 [2024-11-17 04:24:43.141001] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (343340) > buf size (4096) 00:08:04.556 [2024-11-17 04:24:43.141380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.556 [2024-11-17 04:24:43.141407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.556 [2024-11-17 04:24:43.141534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b54a81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.556 [2024-11-17 04:24:43.141552] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.556 [2024-11-17 04:24:43.141673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:4f4a814a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.556 [2024-11-17 04:24:43.141691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.556 #15 NEW cov: 12495 ft: 14217 corp: 8/121b lim: 30 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 CopyPart- 00:08:04.556 [2024-11-17 04:24:43.210770] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.556 [2024-11-17 04:24:43.210971] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.556 [2024-11-17 04:24:43.211158] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:04.556 [2024-11-17 04:24:43.211526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.211554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.211670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.211689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.211811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.211829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.557 #16 NEW cov: 12495 ft: 14236 corp: 9/139b lim: 30 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 ChangeBit- 00:08:04.557 [2024-11-17 04:24:43.260989] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:04.557 [2024-11-17 04:24:43.261176] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.557 [2024-11-17 04:24:43.261355] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004f4a 00:08:04.557 [2024-11-17 04:24:43.261532] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.557 [2024-11-17 04:24:43.261909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.261937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.262057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.262075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.262191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 
cdw10:b5b502b5 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.262208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.262325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4ab581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.262344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.557 #17 NEW cov: 12495 ft: 14749 corp: 10/165b lim: 30 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:04.557 [2024-11-17 04:24:43.311109] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b5b5 00:08:04.557 [2024-11-17 04:24:43.311305] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004ab5 00:08:04.557 [2024-11-17 04:24:43.311498] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:04.557 [2024-11-17 04:24:43.311892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b502b5 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.311920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.312036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b54a024f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.312053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.312179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.312198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.557 #18 NEW cov: 12495 ft: 14819 corp: 11/183b lim: 30 exec/s: 0 rss: 73Mb L: 18/26 MS: 1 ChangeByte- 00:08:04.557 [2024-11-17 04:24:43.361364] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:04.557 [2024-11-17 04:24:43.361566] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:04.557 [2024-11-17 04:24:43.361773] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.557 [2024-11-17 04:24:43.361952] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.557 [2024-11-17 04:24:43.362351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b583ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.362380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.362498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.362515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.362645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b59581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.362664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.557 [2024-11-17 04:24:43.362765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.557 [2024-11-17 04:24:43.362782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.817 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:04.817 #19 NEW cov: 12518 ft: 14888 corp: 12/211b lim: 30 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:04.817 [2024-11-17 04:24:43.431306] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:04.817 [2024-11-17 04:24:43.431705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.817 [2024-11-17 04:24:43.431732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.817 #22 NEW cov: 12518 ft: 15215 corp: 13/220b lim: 30 exec/s: 22 rss: 73Mb L: 9/28 MS: 3 InsertByte-EraseBytes-PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:04.817 [2024-11-17 04:24:43.481546] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.817 [2024-11-17 04:24:43.481741] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.817 [2024-11-17 04:24:43.482140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5be81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.817 [2024-11-17 04:24:43.482168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.817 [2024-11-17 04:24:43.482288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.817 [2024-11-17 04:24:43.482305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.817 #23 NEW cov: 12518 ft: 15228 corp: 14/235b lim: 30 exec/s: 23 rss: 73Mb L: 15/28 MS: 1 ChangeBinInt- 00:08:04.817 [2024-11-17 04:24:43.531671] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.817 [2024-11-17 04:24:43.531872] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.817 [2024-11-17 04:24:43.532256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.818 [2024-11-17 04:24:43.532285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.818 [2024-11-17 04:24:43.532409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.818 [2024-11-17 04:24:43.532425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.818 #24 NEW cov: 12518 ft: 15267 corp: 15/251b lim: 30 exec/s: 24 rss: 73Mb L: 16/28 MS: 1 InsertByte- 00:08:04.818 [2024-11-17 04:24:43.582015] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:04.818 [2024-11-17 04:24:43.582205] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:04.818 [2024-11-17 04:24:43.582389] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:04.818 [2024-11-17 04:24:43.582564] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffb5 00:08:04.818 [2024-11-17 04:24:43.582960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.818 [2024-11-17 04:24:43.582989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.818 [2024-11-17 04:24:43.583108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.818 [2024-11-17 04:24:43.583126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.818 [2024-11-17 04:24:43.583249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.818 [2024-11-17 04:24:43.583268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.818 [2024-11-17 04:24:43.583397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.818 [2024-11-17 04:24:43.583414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.818 #25 NEW cov: 12518 ft: 15304 corp: 16/277b lim: 30 exec/s: 25 rss: 73Mb L: 26/28 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:05.079 [2024-11-17 04:24:43.652262] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.079 [2024-11-17 04:24:43.652471] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.079 [2024-11-17 04:24:43.652649] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004f4a 00:08:05.079 [2024-11-17 04:24:43.652857] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.079 [2024-11-17 04:24:43.653245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.653274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.653401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffe381ff cdw11:00000001 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.653419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.653551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b502b5 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.653570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.653699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4ab581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.653718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.079 #26 NEW cov: 12518 ft: 15439 corp: 17/303b lim: 30 exec/s: 26 rss: 73Mb L: 26/28 MS: 1 ChangeByte- 00:08:05.079 [2024-11-17 04:24:43.702515] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.079 [2024-11-17 04:24:43.702722] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.079 [2024-11-17 04:24:43.702899] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004f4a 00:08:05.079 [2024-11-17 04:24:43.703074] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.079 [2024-11-17 04:24:43.703486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.703521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.703642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffe381ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.703662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.703796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b502b5 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.703816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.703957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4ab581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.703977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.079 #27 NEW cov: 12518 ft: 15583 corp: 18/329b lim: 30 exec/s: 27 rss: 73Mb L: 26/28 MS: 1 ChangeBit- 00:08:05.079 [2024-11-17 04:24:43.772712] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.079 [2024-11-17 04:24:43.772910] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524176) > buf size (4096) 00:08:05.079 [2024-11-17 04:24:43.773103] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.079 [2024-11-17 
04:24:43.773293] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004ab5 00:08:05.079 [2024-11-17 04:24:43.773472] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448152) > buf size (4096) 00:08:05.079 [2024-11-17 04:24:43.773858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.079 [2024-11-17 04:24:43.773890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-11-17 04:24:43.774004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffe381ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.774026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.080 [2024-11-17 04:24:43.774157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:000581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.774177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.080 [2024-11-17 04:24:43.774308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:b54a024f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.774328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.080 [2024-11-17 04:24:43.774449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:b5a581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.774467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.080 #28 NEW cov: 12518 ft: 15686 corp: 19/359b lim: 30 exec/s: 28 rss: 73Mb L: 30/30 MS: 1 CMP- DE: "\000\000\000\005"- 00:08:05.080 [2024-11-17 04:24:43.842858] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.080 [2024-11-17 04:24:43.843055] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b54a 00:08:05.080 [2024-11-17 04:24:43.843232] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004f4a 00:08:05.080 [2024-11-17 04:24:43.843420] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.080 [2024-11-17 04:24:43.843830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.843862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.080 [2024-11-17 04:24:43.843995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.844015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.080 [2024-11-17 04:24:43.844141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b502b5 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.844158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.080 [2024-11-17 04:24:43.844282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4ab581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.080 [2024-11-17 04:24:43.844300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.080 #29 NEW cov: 12518 ft: 15762 corp: 20/385b lim: 30 exec/s: 29 rss: 73Mb L: 26/30 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:05.340 [2024-11-17 04:24:43.912863] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.340 [2024-11-17 04:24:43.913302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5be81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.340 [2024-11-17 04:24:43.913330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.340 #30 NEW cov: 12518 ft: 15779 corp: 21/394b lim: 30 exec/s: 30 rss: 73Mb L: 9/30 MS: 1 EraseBytes- 00:08:05.340 [2024-11-17 04:24:43.983213] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.340 [2024-11-17 04:24:43.983412] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004ab5 00:08:05.340 [2024-11-17 04:24:43.983601] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:05.340 [2024-11-17 04:24:43.984001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.340 [2024-11-17 04:24:43.984032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.340 [2024-11-17 04:24:43.984151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b54a024f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.340 [2024-11-17 04:24:43.984171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.340 [2024-11-17 04:24:43.984299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.340 [2024-11-17 04:24:43.984319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.340 #31 NEW cov: 12518 ft: 15803 corp: 22/412b lim: 30 exec/s: 31 rss: 73Mb L: 18/30 MS: 1 ChangeBinInt- 00:08:05.340 [2024-11-17 04:24:44.033159] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5ff 00:08:05.340 [2024-11-17 04:24:44.033600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.340 [2024-11-17 04:24:44.033629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.340 #35 NEW cov: 12518 ft: 15846 corp: 
23/420b lim: 30 exec/s: 35 rss: 73Mb L: 8/30 MS: 4 EraseBytes-CopyPart-EraseBytes-PersAutoDict- DE: "\000\000\000\005"- 00:08:05.340 [2024-11-17 04:24:44.103448] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.340 [2024-11-17 04:24:44.103843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.340 [2024-11-17 04:24:44.103873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.340 #36 NEW cov: 12518 ft: 15867 corp: 24/427b lim: 30 exec/s: 36 rss: 74Mb L: 7/30 MS: 1 EraseBytes- 00:08:05.601 [2024-11-17 04:24:44.173548] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448252) > buf size (4096) 00:08:05.601 [2024-11-17 04:24:44.174015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5be81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.174045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.601 #37 NEW cov: 12518 ft: 15945 corp: 25/433b lim: 30 exec/s: 37 rss: 74Mb L: 6/30 MS: 1 EraseBytes- 00:08:05.601 [2024-11-17 04:24:44.224031] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.601 [2024-11-17 04:24:44.224230] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.601 [2024-11-17 04:24:44.224403] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004f4a 00:08:05.601 [2024-11-17 04:24:44.224580] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.601 [2024-11-17 04:24:44.224970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.224998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.225111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffe381ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.225128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.225263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b502b5 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.225281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.225417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4aa581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.225434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.601 #38 NEW cov: 12518 ft: 15946 corp: 26/459b lim: 30 exec/s: 38 rss: 74Mb L: 26/30 MS: 1 ShuffleBytes- 00:08:05.601 [2024-11-17 04:24:44.274134] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x30000ffff 00:08:05.601 [2024-11-17 04:24:44.274329] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006767 00:08:05.601 [2024-11-17 04:24:44.274500] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.601 [2024-11-17 04:24:44.274669] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000a5b5 00:08:05.601 [2024-11-17 04:24:44.275062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.275090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.275212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffe381ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.275232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.275352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:67b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.275371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.275493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4a4f024a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.275510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.601 #44 NEW cov: 12518 ft: 15961 corp: 27/488b lim: 30 exec/s: 44 rss: 74Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:08:05.601 [2024-11-17 04:24:44.344251] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.601 [2024-11-17 04:24:44.344445] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004ab5 00:08:05.601 [2024-11-17 04:24:44.344623] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:08:05.601 [2024-11-17 04:24:44.345017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.345046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.345158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b54a024f cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.345178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.345293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.345311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.601 #45 NEW cov: 12518 ft: 15962 corp: 28/507b lim: 30 
exec/s: 45 rss: 74Mb L: 19/30 MS: 1 InsertByte- 00:08:05.601 [2024-11-17 04:24:44.414653] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.601 [2024-11-17 04:24:44.414857] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524288) > buf size (4096) 00:08:05.601 [2024-11-17 04:24:44.415039] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004f4a 00:08:05.601 [2024-11-17 04:24:44.415218] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.601 [2024-11-17 04:24:44.415603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.415633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.415765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.415782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.415897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.415913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.601 [2024-11-17 04:24:44.416042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4ab581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.601 [2024-11-17 04:24:44.416064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.862 #46 NEW cov: 12518 ft: 15996 corp: 29/533b lim: 30 exec/s: 46 rss: 74Mb L: 26/30 MS: 1 PersAutoDict- DE: "\000\000\000\005"- 00:08:05.862 [2024-11-17 04:24:44.464851] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.862 [2024-11-17 04:24:44.465040] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006767 00:08:05.862 [2024-11-17 04:24:44.465228] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:08:05.862 [2024-11-17 04:24:44.465412] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000a5b5 00:08:05.862 [2024-11-17 04:24:44.465798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.862 [2024-11-17 04:24:44.465826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.862 [2024-11-17 04:24:44.465955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffe381ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.862 [2024-11-17 04:24:44.465974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.862 [2024-11-17 04:24:44.466099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 
nsid:0 cdw10:67b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.862 [2024-11-17 04:24:44.466115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.862 [2024-11-17 04:24:44.466241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4a4f024a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.862 [2024-11-17 04:24:44.466259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.862 #47 NEW cov: 12518 ft: 16006 corp: 30/562b lim: 30 exec/s: 23 rss: 74Mb L: 29/30 MS: 1 ShuffleBytes- 00:08:05.862 #47 DONE cov: 12518 ft: 16006 corp: 30/562b lim: 30 exec/s: 23 rss: 74Mb 00:08:05.862 ###### Recommended dictionary. ###### 00:08:05.862 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:05.862 "\000\000\000\005" # Uses: 2 00:08:05.862 ###### End of recommended dictionary. ###### 00:08:05.862 Done 47 runs in 2 second(s) 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.862 04:24:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:08:05.862 [2024-11-17 04:24:44.653513] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:05.862 [2024-11-17 04:24:44.653579] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149443 ] 00:08:06.123 [2024-11-17 04:24:44.851471] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.123 [2024-11-17 04:24:44.864664] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.123 [2024-11-17 04:24:44.917506] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.123 [2024-11-17 04:24:44.933819] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:06.123 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.123 INFO: Seed: 2237431088 00:08:06.383 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:06.383 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:06.383 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:06.383 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.383 #2 INITED exec/s: 0 rss: 66Mb 00:08:06.383 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.383 This may also happen if the target rejected all inputs we tried so far 00:08:06.383 [2024-11-17 04:24:44.992276] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.383 [2024-11-17 04:24:44.992412] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.383 [2024-11-17 04:24:44.992642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-11-17 04:24:44.992670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.383 [2024-11-17 04:24:44.992736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-11-17 04:24:44.992752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.383 [2024-11-17 04:24:44.992801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-11-17 04:24:44.992817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.644 NEW_FUNC[1/715]: 0x455b38 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:06.644 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.644 #4 NEW cov: 12235 ft: 12236 corp: 2/24b lim: 35 exec/s: 0 rss: 72Mb L: 23/23 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:06.644 [2024-11-17 04:24:45.323301] 
ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.644 [2024-11-17 04:24:45.323447] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.644 [2024-11-17 04:24:45.323719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.323774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.644 [2024-11-17 04:24:45.323852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.323883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.644 [2024-11-17 04:24:45.323961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.323991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.644 #5 NEW cov: 12348 ft: 12842 corp: 3/47b lim: 35 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:06.644 [2024-11-17 04:24:45.393218] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.644 [2024-11-17 04:24:45.393346] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.644 [2024-11-17 04:24:45.393574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.393602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.644 [2024-11-17 04:24:45.393656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.393674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.644 [2024-11-17 04:24:45.393712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.393727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.644 #6 NEW cov: 12354 ft: 13144 corp: 4/70b lim: 35 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 ChangeBit- 00:08:06.644 [2024-11-17 04:24:45.433383] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.644 [2024-11-17 04:24:45.433613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.433639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.644 [2024-11-17 04:24:45.433690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:5 nsid:0 cdw10:70000097 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.433709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.644 [2024-11-17 04:24:45.433760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.644 [2024-11-17 04:24:45.433779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.905 #7 NEW cov: 12439 ft: 13579 corp: 5/93b lim: 35 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 CMP- DE: "\377\211v\210oD\227p"- 00:08:06.905 [2024-11-17 04:24:45.493639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.493666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.905 [2024-11-17 04:24:45.493721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.493752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.905 #8 NEW cov: 12439 ft: 13994 corp: 6/111b lim: 35 exec/s: 0 rss: 72Mb L: 18/23 MS: 1 EraseBytes- 00:08:06.905 [2024-11-17 04:24:45.553775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdd004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.553801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.905 [2024-11-17 04:24:45.553854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.553869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.905 #9 NEW cov: 12439 ft: 14049 corp: 7/129b lim: 35 exec/s: 0 rss: 73Mb L: 18/23 MS: 1 ChangeByte- 00:08:06.905 [2024-11-17 04:24:45.613895] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.905 [2024-11-17 04:24:45.614128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.614170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.905 [2024-11-17 04:24:45.614222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70000097 cdw11:ff002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.614236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.905 [2024-11-17 04:24:45.614286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.614301] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.905 #10 NEW cov: 12439 ft: 14198 corp: 8/152b lim: 35 exec/s: 0 rss: 73Mb L: 23/23 MS: 1 ChangeBit- 00:08:06.905 [2024-11-17 04:24:45.653980] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.905 [2024-11-17 04:24:45.654230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdd004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.654254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.905 [2024-11-17 04:24:45.654307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.654321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.905 [2024-11-17 04:24:45.654372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:890000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.654386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.905 #11 NEW cov: 12439 ft: 14293 corp: 9/178b lim: 35 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 PersAutoDict- DE: "\377\211v\210oD\227p"- 00:08:06.905 [2024-11-17 04:24:45.714086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.905 [2024-11-17 04:24:45.714111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.165 #12 NEW cov: 12439 ft: 14624 corp: 10/186b lim: 35 exec/s: 0 rss: 73Mb L: 8/26 MS: 1 InsertRepeatedBytes- 00:08:07.165 [2024-11-17 04:24:45.754315] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.165 [2024-11-17 04:24:45.754545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.165 [2024-11-17 04:24:45.754570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.165 [2024-11-17 04:24:45.754623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70000097 cdw11:0000fffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.754638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.754687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.754709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.166 #13 NEW cov: 12439 ft: 14704 corp: 11/209b lim: 35 exec/s: 0 rss: 73Mb L: 23/26 MS: 1 CopyPart- 00:08:07.166 [2024-11-17 04:24:45.794533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.794558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.794612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:baba00ba cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.794625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.794674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:baba00ba cdw11:ba00baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.794688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.166 #14 NEW cov: 12439 ft: 14740 corp: 12/233b lim: 35 exec/s: 0 rss: 73Mb L: 24/26 MS: 1 InsertRepeatedBytes- 00:08:07.166 [2024-11-17 04:24:45.854478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.854503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.166 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:07.166 #15 NEW cov: 12462 ft: 14815 corp: 13/243b lim: 35 exec/s: 0 rss: 73Mb L: 10/26 MS: 1 CopyPart- 00:08:07.166 [2024-11-17 04:24:45.894592] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.894755] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.894880] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.895107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.895133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.895187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.895203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.895257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.895272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.895323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00ff0000 cdw11:88008976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.895339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.166 #16 NEW cov: 12462 ft: 15357 corp: 14/274b lim: 35 exec/s: 0 rss: 73Mb L: 31/31 MS: 1 PersAutoDict- DE: "\377\211v\210oD\227p"- 00:08:07.166 [2024-11-17 04:24:45.934684] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.934818] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.935055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.935081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.935135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.935151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.935202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.935217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.166 #17 NEW cov: 12462 ft: 15367 corp: 15/297b lim: 35 exec/s: 0 rss: 73Mb L: 23/31 MS: 1 ShuffleBytes- 00:08:07.166 [2024-11-17 04:24:45.974824] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.975055] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.166 [2024-11-17 04:24:45.975283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.975308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.975360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.975376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.975427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.975441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.166 [2024-11-17 04:24:45.975490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.166 [2024-11-17 04:24:45.975506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.427 #18 NEW cov: 12462 ft: 15386 corp: 16/328b lim: 35 exec/s: 18 rss: 73Mb L: 31/31 
MS: 1 InsertRepeatedBytes- 00:08:07.427 [2024-11-17 04:24:46.014897] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.427 [2024-11-17 04:24:46.015230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.015255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.015307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.015323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.015373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.015387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.427 #19 NEW cov: 12462 ft: 15400 corp: 17/355b lim: 35 exec/s: 19 rss: 73Mb L: 27/31 MS: 1 EraseBytes- 00:08:07.427 [2024-11-17 04:24:46.075106] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.427 [2024-11-17 04:24:46.075446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.075472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.075525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.075540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.075592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.075606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.427 #20 NEW cov: 12462 ft: 15421 corp: 18/382b lim: 35 exec/s: 20 rss: 73Mb L: 27/31 MS: 1 ChangeBinInt- 00:08:07.427 [2024-11-17 04:24:46.135255] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.427 [2024-11-17 04:24:46.135389] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.427 [2024-11-17 04:24:46.135620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdd004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.135645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.135701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.135716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.135768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:890000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.135786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.427 #21 NEW cov: 12462 ft: 15445 corp: 19/408b lim: 35 exec/s: 21 rss: 73Mb L: 26/31 MS: 1 ShuffleBytes- 00:08:07.427 [2024-11-17 04:24:46.195551] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.427 [2024-11-17 04:24:46.195673] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.427 [2024-11-17 04:24:46.195907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.195937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.195997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70000097 cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.196011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.196064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000fd00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.196080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.196138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.196154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.427 #22 NEW cov: 12462 ft: 15512 corp: 20/437b lim: 35 exec/s: 22 rss: 73Mb L: 29/31 MS: 1 InsertRepeatedBytes- 00:08:07.427 [2024-11-17 04:24:46.256048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:7600ff89 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.256074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.427 [2024-11-17 04:24:46.256131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4497006f cdw11:88007076 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.427 [2024-11-17 04:24:46.256146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.256200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:97700044 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.256214] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.256270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:000000fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.256283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.688 #23 NEW cov: 12462 ft: 15570 corp: 21/468b lim: 35 exec/s: 23 rss: 73Mb L: 31/31 MS: 1 PersAutoDict- DE: "\377\211v\210oD\227p"- 00:08:07.688 [2024-11-17 04:24:46.315732] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.688 [2024-11-17 04:24:46.316080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.316106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.316161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.316176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.316232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.316245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.688 #24 NEW cov: 12462 ft: 15585 corp: 22/495b lim: 35 exec/s: 24 rss: 74Mb L: 27/31 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:07.688 [2024-11-17 04:24:46.376275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:9d00769d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.376300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.376356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:9d9d009d cdw11:9d009d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.376370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.376425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4497006f cdw11:20007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.376438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.376494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.376507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.688 #25 NEW cov: 12462 ft: 15603 corp: 23/527b lim: 35 exec/s: 25 rss: 
74Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:07.688 [2024-11-17 04:24:46.416150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff5d004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.416175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.416231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.416244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.688 #26 NEW cov: 12462 ft: 15609 corp: 24/545b lim: 35 exec/s: 26 rss: 74Mb L: 18/32 MS: 1 ChangeBit- 00:08:07.688 [2024-11-17 04:24:46.456297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdd004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.456323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.456397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:bf000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.456411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.688 #27 NEW cov: 12462 ft: 15655 corp: 25/563b lim: 35 exec/s: 27 rss: 74Mb L: 18/32 MS: 1 ChangeByte- 00:08:07.688 [2024-11-17 04:24:46.496325] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.688 [2024-11-17 04:24:46.496554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.496580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.496638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70000097 cdw11:ff002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.496652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.688 [2024-11-17 04:24:46.496713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.688 [2024-11-17 04:24:46.496733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.949 #28 NEW cov: 12462 ft: 15659 corp: 26/586b lim: 35 exec/s: 28 rss: 74Mb L: 23/32 MS: 1 ChangeBit- 00:08:07.949 [2024-11-17 04:24:46.536331] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.949 [2024-11-17 04:24:46.536668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.536700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.536754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.536769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.536824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:76880089 cdw11:97006f44 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.536838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.949 #29 NEW cov: 12462 ft: 15688 corp: 27/609b lim: 35 exec/s: 29 rss: 74Mb L: 23/32 MS: 1 PersAutoDict- DE: "\377\211v\210oD\227p"- 00:08:07.949 [2024-11-17 04:24:46.576630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff5d004b cdw11:3d007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.576655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.576734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.576750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.949 #30 NEW cov: 12462 ft: 15734 corp: 28/627b lim: 35 exec/s: 30 rss: 74Mb L: 18/32 MS: 1 ChangeByte- 00:08:07.949 [2024-11-17 04:24:46.636756] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.949 [2024-11-17 04:24:46.637102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.637128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.637182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70000097 cdw11:ff002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.637196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.637248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.637264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.637317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3d3d003d cdw11:00003d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.637330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.949 #31 NEW cov: 12462 ft: 15764 corp: 29/655b lim: 35 exec/s: 31 rss: 74Mb L: 28/32 MS: 1 InsertRepeatedBytes- 00:08:07.949 [2024-11-17 04:24:46.696871] 
ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.949 [2024-11-17 04:24:46.697013] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.949 [2024-11-17 04:24:46.697241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.697265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.697317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.697333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.697387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.697402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.949 #32 NEW cov: 12462 ft: 15812 corp: 30/678b lim: 35 exec/s: 32 rss: 74Mb L: 23/32 MS: 1 ChangeByte- 00:08:07.949 [2024-11-17 04:24:46.757004] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.949 [2024-11-17 04:24:46.757127] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.949 [2024-11-17 04:24:46.757339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:06000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.757366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.757420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.757437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.949 [2024-11-17 04:24:46.757489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.949 [2024-11-17 04:24:46.757504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.210 #33 NEW cov: 12462 ft: 15818 corp: 31/701b lim: 35 exec/s: 33 rss: 74Mb L: 23/32 MS: 1 ChangeBinInt- 00:08:08.210 [2024-11-17 04:24:46.797152] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:08.210 [2024-11-17 04:24:46.797373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdd004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.210 [2024-11-17 04:24:46.797400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.210 [2024-11-17 04:24:46.797455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY 
(06) qid:0 cid:5 nsid:0 cdw10:fd0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.210 [2024-11-17 04:24:46.797469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.210 [2024-11-17 04:24:46.797519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:04000000 cdw11:890000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.210 [2024-11-17 04:24:46.797533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.210 #34 NEW cov: 12462 ft: 15837 corp: 32/727b lim: 35 exec/s: 34 rss: 74Mb L: 26/32 MS: 1 ChangeBinInt- 00:08:08.210 [2024-11-17 04:24:46.837178] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:08.210 [2024-11-17 04:24:46.837303] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:08.210 [2024-11-17 04:24:46.837517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff5d004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.210 [2024-11-17 04:24:46.837543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.210 [2024-11-17 04:24:46.837599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.210 [2024-11-17 04:24:46.837615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.210 [2024-11-17 04:24:46.837666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:fffd0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.210 [2024-11-17 04:24:46.837681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.210 #35 NEW cov: 12462 ft: 15846 corp: 33/753b lim: 35 exec/s: 35 rss: 74Mb L: 26/32 MS: 1 CMP- DE: "\000\004\000\000\000\000\000\000"- 00:08:08.210 [2024-11-17 04:24:46.877447] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:08.210 [2024-11-17 04:24:46.877567] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:08.211 [2024-11-17 04:24:46.877785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdd004b cdw11:00007000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.877811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.877864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fd2c00ff cdw11:2c002c2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.877879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.877931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.877947] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.877998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff890000 cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.878012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.211 #36 NEW cov: 12462 ft: 15849 corp: 34/783b lim: 35 exec/s: 36 rss: 74Mb L: 30/32 MS: 1 InsertRepeatedBytes- 00:08:08.211 [2024-11-17 04:24:46.917744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.917770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.917827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.917840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.917892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.917905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.211 #37 NEW cov: 12462 ft: 15853 corp: 35/810b lim: 35 exec/s: 37 rss: 74Mb L: 27/32 MS: 1 CopyPart- 00:08:08.211 [2024-11-17 04:24:46.957846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff89004b cdw11:6f007688 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.957871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.957945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:76700097 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.957959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.211 [2024-11-17 04:24:46.958014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:000000fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.211 [2024-11-17 04:24:46.958027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.211 #38 NEW cov: 12462 ft: 15868 corp: 36/834b lim: 35 exec/s: 19 rss: 74Mb L: 24/32 MS: 1 CrossOver- 00:08:08.211 #38 DONE cov: 12462 ft: 15868 corp: 36/834b lim: 35 exec/s: 19 rss: 74Mb 00:08:08.211 ###### Recommended dictionary. ###### 00:08:08.211 "\377\211v\210oD\227p" # Uses: 4 00:08:08.211 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:08.211 "\000\004\000\000\000\000\000\000" # Uses: 0 00:08:08.211 ###### End of recommended dictionary. 
###### 00:08:08.211 Done 38 runs in 2 second(s) 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.471 04:24:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:08:08.471 [2024-11-17 04:24:47.121651] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:08.471 [2024-11-17 04:24:47.121721] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149914 ] 00:08:08.731 [2024-11-17 04:24:47.320774] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.731 [2024-11-17 04:24:47.333424] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.732 [2024-11-17 04:24:47.385856] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.732 [2024-11-17 04:24:47.402183] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:08.732 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:08.732 INFO: Seed: 413452758 00:08:08.732 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:08.732 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:08.732 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:08.732 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.732 #2 INITED exec/s: 0 rss: 65Mb 00:08:08.732 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.732 This may also happen if the target rejected all inputs we tried so far 00:08:08.991 NEW_FUNC[1/704]: 0x457818 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:08.991 NEW_FUNC[2/704]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.991 #16 NEW cov: 12142 ft: 12141 corp: 2/19b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 4 ChangeByte-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:09.251 #17 NEW cov: 12257 ft: 12824 corp: 3/37b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ChangeBinInt- 00:08:09.251 #18 NEW cov: 12263 ft: 13150 corp: 4/55b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ChangeByte- 00:08:09.251 #24 NEW cov: 12348 ft: 13376 corp: 5/74b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertByte- 00:08:09.511 #25 NEW cov: 12353 ft: 13822 corp: 6/84b lim: 20 exec/s: 0 rss: 72Mb L: 10/19 MS: 1 EraseBytes- 00:08:09.511 #26 NEW cov: 12353 ft: 14028 corp: 7/102b lim: 20 exec/s: 0 rss: 72Mb L: 18/19 MS: 1 CrossOver- 00:08:09.511 #27 NEW cov: 12353 ft: 14224 corp: 8/121b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertByte- 00:08:09.511 #28 NEW cov: 12353 ft: 14304 corp: 9/139b lim: 20 exec/s: 0 rss: 72Mb L: 18/19 MS: 1 ShuffleBytes- 00:08:09.511 #29 NEW cov: 12353 ft: 14394 corp: 10/157b lim: 20 exec/s: 0 rss: 72Mb L: 18/19 MS: 1 ShuffleBytes- 00:08:09.771 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:09.771 #30 NEW cov: 12376 ft: 14461 corp: 11/176b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ChangeBit- 00:08:09.771 #31 NEW cov: 12376 ft: 14545 corp: 12/196b lim: 20 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 CopyPart- 00:08:09.771 #32 NEW cov: 12376 ft: 14561 corp: 13/214b lim: 20 exec/s: 32 rss: 72Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:09.771 #33 NEW cov: 12376 ft: 14608 corp: 14/232b lim: 20 exec/s: 33 rss: 72Mb L: 18/20 MS: 1 ChangeByte- 00:08:09.771 #34 NEW cov: 12376 ft: 14655 corp: 15/250b lim: 20 exec/s: 34 rss: 72Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:10.031 #35 NEW cov: 12376 ft: 14704 corp: 16/268b lim: 20 exec/s: 35 rss: 72Mb L: 18/20 MS: 1 ChangeByte- 00:08:10.031 #36 NEW cov: 12376 ft: 14715 corp: 17/286b lim: 20 exec/s: 36 rss: 72Mb L: 18/20 MS: 1 CrossOver- 00:08:10.031 #37 NEW cov: 12376 ft: 14729 corp: 18/305b lim: 20 exec/s: 37 rss: 72Mb L: 19/20 MS: 1 InsertByte- 00:08:10.031 #38 NEW cov: 12376 ft: 14762 corp: 19/325b lim: 20 exec/s: 38 rss: 72Mb L: 20/20 MS: 1 InsertByte- 00:08:10.031 #44 NEW cov: 12376 ft: 14773 corp: 20/343b lim: 20 exec/s: 44 rss: 72Mb L: 18/20 MS: 1 CrossOver- 00:08:10.291 #45 NEW cov: 12376 ft: 14780 corp: 21/361b lim: 20 exec/s: 45 rss: 72Mb L: 18/20 MS: 1 CMP- DE: "\377\211v\212k/\035X"- 00:08:10.291 #46 NEW cov: 12376 ft: 14810 corp: 22/381b lim: 20 exec/s: 46 rss: 73Mb L: 20/20 MS: 1 CopyPart- 00:08:10.291 #47 NEW cov: 12376 ft: 14816 corp: 23/399b lim: 20 exec/s: 47 
rss: 73Mb L: 18/20 MS: 1 ChangeBit- 00:08:10.291 #48 NEW cov: 12380 ft: 14934 corp: 24/414b lim: 20 exec/s: 48 rss: 73Mb L: 15/20 MS: 1 EraseBytes- 00:08:10.291 #49 NEW cov: 12380 ft: 14981 corp: 25/432b lim: 20 exec/s: 49 rss: 73Mb L: 18/20 MS: 1 ChangeBit- 00:08:10.291 #50 NEW cov: 12380 ft: 14992 corp: 26/450b lim: 20 exec/s: 50 rss: 73Mb L: 18/20 MS: 1 ChangeBinInt- 00:08:10.555 #51 NEW cov: 12380 ft: 15018 corp: 27/461b lim: 20 exec/s: 51 rss: 73Mb L: 11/20 MS: 1 EraseBytes- 00:08:10.555 #52 NEW cov: 12380 ft: 15099 corp: 28/470b lim: 20 exec/s: 52 rss: 73Mb L: 9/20 MS: 1 EraseBytes- 00:08:10.555 #53 NEW cov: 12380 ft: 15108 corp: 29/488b lim: 20 exec/s: 53 rss: 73Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:10.555 #54 NEW cov: 12380 ft: 15113 corp: 30/506b lim: 20 exec/s: 54 rss: 73Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:10.555 #55 NEW cov: 12380 ft: 15122 corp: 31/517b lim: 20 exec/s: 55 rss: 73Mb L: 11/20 MS: 1 CrossOver- 00:08:10.820 #56 NEW cov: 12380 ft: 15139 corp: 32/537b lim: 20 exec/s: 56 rss: 73Mb L: 20/20 MS: 1 ShuffleBytes- 00:08:10.820 #57 NEW cov: 12380 ft: 15154 corp: 33/547b lim: 20 exec/s: 28 rss: 73Mb L: 10/20 MS: 1 ChangeByte- 00:08:10.820 #57 DONE cov: 12380 ft: 15154 corp: 33/547b lim: 20 exec/s: 28 rss: 73Mb 00:08:10.820 ###### Recommended dictionary. ###### 00:08:10.820 "\377\211v\212k/\035X" # Uses: 0 00:08:10.820 ###### End of recommended dictionary. ###### 00:08:10.820 Done 57 runs in 2 second(s) 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:10.820 04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:10.820 
04:24:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:10.820 [2024-11-17 04:24:49.620870] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:10.820 [2024-11-17 04:24:49.620941] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid150269 ] 00:08:11.080 [2024-11-17 04:24:49.820692] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.080 [2024-11-17 04:24:49.833182] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.080 [2024-11-17 04:24:49.885931] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.080 [2024-11-17 04:24:49.902256] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:11.339 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.339 INFO: Seed: 2911449974 00:08:11.339 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:11.339 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:11.339 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:11.339 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.339 #2 INITED exec/s: 0 rss: 65Mb 00:08:11.339 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:11.339 This may also happen if the target rejected all inputs we tried so far 00:08:11.339 [2024-11-17 04:24:49.959608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.339 [2024-11-17 04:24:49.959637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.339 [2024-11-17 04:24:49.959691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.339 [2024-11-17 04:24:49.959710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.339 [2024-11-17 04:24:49.959776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.339 [2024-11-17 04:24:49.959790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.599 NEW_FUNC[1/716]: 0x458918 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:11.599 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.599 #4 NEW cov: 12245 ft: 12244 corp: 2/25b lim: 35 exec/s: 0 rss: 72Mb L: 24/24 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:11.599 [2024-11-17 04:24:50.311376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.599 [2024-11-17 04:24:50.311429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.599 #7 NEW cov: 12358 ft: 13591 corp: 3/38b lim: 35 exec/s: 0 rss: 72Mb L: 13/24 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:11.599 [2024-11-17 04:24:50.371884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.599 [2024-11-17 04:24:50.371913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.599 [2024-11-17 04:24:50.372033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.599 [2024-11-17 04:24:50.372048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.599 [2024-11-17 04:24:50.372165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.599 [2024-11-17 04:24:50.372182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.599 #13 NEW cov: 12364 ft: 13722 corp: 4/62b lim: 35 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 CopyPart- 00:08:11.859 [2024-11-17 04:24:50.442049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff 
cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.442076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.859 [2024-11-17 04:24:50.442221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.442241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.859 [2024-11-17 04:24:50.442352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.442370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.859 #14 NEW cov: 12449 ft: 14041 corp: 5/83b lim: 35 exec/s: 0 rss: 72Mb L: 21/24 MS: 1 InsertRepeatedBytes- 00:08:11.859 [2024-11-17 04:24:50.512025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.512054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.859 [2024-11-17 04:24:50.512174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.512192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.859 #15 NEW cov: 12449 ft: 14429 corp: 6/101b lim: 35 exec/s: 0 rss: 72Mb L: 18/24 MS: 1 CopyPart- 00:08:11.859 [2024-11-17 04:24:50.562716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:56560a56 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.562743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.859 [2024-11-17 04:24:50.562861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.859 [2024-11-17 04:24:50.562877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.860 [2024-11-17 04:24:50.562989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.860 [2024-11-17 04:24:50.563005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.860 [2024-11-17 04:24:50.563117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.860 [2024-11-17 04:24:50.563135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.860 #16 NEW cov: 12449 ft: 14851 corp: 7/131b lim: 35 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:11.860 [2024-11-17 
04:24:50.612369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.860 [2024-11-17 04:24:50.612398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.860 [2024-11-17 04:24:50.612517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.860 [2024-11-17 04:24:50.612533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.860 #17 NEW cov: 12449 ft: 14945 corp: 8/149b lim: 35 exec/s: 0 rss: 72Mb L: 18/30 MS: 1 ShuffleBytes- 00:08:11.860 [2024-11-17 04:24:50.682508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.860 [2024-11-17 04:24:50.682536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.860 [2024-11-17 04:24:50.682651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.860 [2024-11-17 04:24:50.682672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.120 #23 NEW cov: 12449 ft: 14963 corp: 9/167b lim: 35 exec/s: 0 rss: 72Mb L: 18/30 MS: 1 ChangeByte- 00:08:12.120 [2024-11-17 04:24:50.733240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.733267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.733385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffff2d cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.733403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.733535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.733552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.733662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.733680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.120 #24 NEW cov: 12449 ft: 15015 corp: 10/201b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 CrossOver- 00:08:12.120 [2024-11-17 04:24:50.803224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.803251] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.803365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff2dffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.803382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.803500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.803516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.120 #25 NEW cov: 12449 ft: 15091 corp: 11/227b lim: 35 exec/s: 0 rss: 72Mb L: 26/34 MS: 1 CopyPart- 00:08:12.120 [2024-11-17 04:24:50.852990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.853016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.853131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.853161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.120 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:12.120 #26 NEW cov: 12472 ft: 15123 corp: 12/242b lim: 35 exec/s: 0 rss: 73Mb L: 15/34 MS: 1 EraseBytes- 00:08:12.120 [2024-11-17 04:24:50.923156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.923185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.120 [2024-11-17 04:24:50.923293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.120 [2024-11-17 04:24:50.923310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.120 #27 NEW cov: 12472 ft: 15146 corp: 13/260b lim: 35 exec/s: 0 rss: 73Mb L: 18/34 MS: 1 CopyPart- 00:08:12.380 [2024-11-17 04:24:50.973608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:50.973635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.380 [2024-11-17 04:24:50.973755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff2dffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:50.973771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.380 [2024-11-17 
04:24:50.973890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fdff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:50.973905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.380 #28 NEW cov: 12472 ft: 15193 corp: 14/286b lim: 35 exec/s: 28 rss: 73Mb L: 26/34 MS: 1 ChangeBinInt- 00:08:12.380 [2024-11-17 04:24:51.043349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2d01 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:51.043377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.380 #29 NEW cov: 12472 ft: 15221 corp: 15/299b lim: 35 exec/s: 29 rss: 73Mb L: 13/34 MS: 1 ChangeBinInt- 00:08:12.380 [2024-11-17 04:24:51.094345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:56560a56 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:51.094372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.380 [2024-11-17 04:24:51.094493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:51.094509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.380 [2024-11-17 04:24:51.094624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:56560056 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:51.094641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.380 [2024-11-17 04:24:51.094762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56560002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:51.094780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.380 #30 NEW cov: 12472 ft: 15231 corp: 16/330b lim: 35 exec/s: 30 rss: 73Mb L: 31/34 MS: 1 InsertByte- 00:08:12.380 [2024-11-17 04:24:51.163739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.380 [2024-11-17 04:24:51.163766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.380 #31 NEW cov: 12472 ft: 15240 corp: 17/338b lim: 35 exec/s: 31 rss: 73Mb L: 8/34 MS: 1 EraseBytes- 00:08:12.640 [2024-11-17 04:24:51.214485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.640 [2024-11-17 04:24:51.214514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.640 [2024-11-17 04:24:51.214628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.640 [2024-11-17 04:24:51.214644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.640 [2024-11-17 04:24:51.214766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.640 [2024-11-17 04:24:51.214781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.641 #40 NEW cov: 12472 ft: 15251 corp: 18/363b lim: 35 exec/s: 40 rss: 73Mb L: 25/34 MS: 4 ChangeBit-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:12.641 [2024-11-17 04:24:51.264823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.264849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.641 [2024-11-17 04:24:51.264964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:53535353 cdw11:2dff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.264980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.641 [2024-11-17 04:24:51.265101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.265116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.641 [2024-11-17 04:24:51.265234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.265251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.641 #41 NEW cov: 12472 ft: 15257 corp: 19/392b lim: 35 exec/s: 41 rss: 73Mb L: 29/34 MS: 1 CrossOver- 00:08:12.641 [2024-11-17 04:24:51.334415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ff960003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.334442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.641 [2024-11-17 04:24:51.334559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.334576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.641 #42 NEW cov: 12472 ft: 15278 corp: 20/411b lim: 35 exec/s: 42 rss: 73Mb L: 19/34 MS: 1 InsertByte- 00:08:12.641 [2024-11-17 04:24:51.384651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:2dff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.384679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.641 
[2024-11-17 04:24:51.384814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.384833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.641 #43 NEW cov: 12472 ft: 15284 corp: 21/431b lim: 35 exec/s: 43 rss: 73Mb L: 20/34 MS: 1 CrossOver- 00:08:12.641 [2024-11-17 04:24:51.434938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:2dff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.434967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.641 [2024-11-17 04:24:51.435089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.641 [2024-11-17 04:24:51.435106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.901 #44 NEW cov: 12472 ft: 15299 corp: 22/451b lim: 35 exec/s: 44 rss: 73Mb L: 20/34 MS: 1 CrossOver- 00:08:12.901 [2024-11-17 04:24:51.505076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.505105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.901 [2024-11-17 04:24:51.505235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.505252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.901 #45 NEW cov: 12472 ft: 15318 corp: 23/470b lim: 35 exec/s: 45 rss: 73Mb L: 19/34 MS: 1 EraseBytes- 00:08:12.901 [2024-11-17 04:24:51.575268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.575295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.901 [2024-11-17 04:24:51.575416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.575436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.901 #46 NEW cov: 12472 ft: 15338 corp: 24/489b lim: 35 exec/s: 46 rss: 73Mb L: 19/34 MS: 1 EraseBytes- 00:08:12.901 [2024-11-17 04:24:51.646011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:2dff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.646040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.901 [2024-11-17 04:24:51.646168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff 
cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.646185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.901 [2024-11-17 04:24:51.646302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.646318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.901 [2024-11-17 04:24:51.646434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.646452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.901 #47 NEW cov: 12472 ft: 15351 corp: 25/519b lim: 35 exec/s: 47 rss: 73Mb L: 30/34 MS: 1 CopyPart- 00:08:12.901 [2024-11-17 04:24:51.695584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.695616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.901 [2024-11-17 04:24:51.695751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:25ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.901 [2024-11-17 04:24:51.695771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.901 #48 NEW cov: 12472 ft: 15417 corp: 26/538b lim: 35 exec/s: 48 rss: 73Mb L: 19/34 MS: 1 InsertByte- 00:08:13.162 [2024-11-17 04:24:51.746434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.746464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.746595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.746612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.746732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.746748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.746870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.746888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.162 #49 NEW cov: 12472 ft: 15452 corp: 27/569b lim: 35 exec/s: 49 rss: 73Mb L: 31/34 MS: 1 InsertRepeatedBytes- 00:08:13.162 [2024-11-17 
04:24:51.816091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:2dff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.816119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.816240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fb530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.816257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.162 #50 NEW cov: 12472 ft: 15495 corp: 28/589b lim: 35 exec/s: 50 rss: 73Mb L: 20/34 MS: 1 ChangeBit- 00:08:13.162 [2024-11-17 04:24:51.866438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53530a53 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.866466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.866586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.866602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.866722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.866738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.162 #51 NEW cov: 12472 ft: 15503 corp: 29/613b lim: 35 exec/s: 51 rss: 73Mb L: 24/34 MS: 1 ChangeByte- 00:08:13.162 [2024-11-17 04:24:51.916399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:53000a53 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.916426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.916544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.916562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.162 #52 NEW cov: 12472 ft: 15535 corp: 30/631b lim: 35 exec/s: 52 rss: 73Mb L: 18/34 MS: 1 CrossOver- 00:08:13.162 [2024-11-17 04:24:51.966625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff2db3 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.966652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.162 [2024-11-17 04:24:51.966774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.162 [2024-11-17 04:24:51.966800] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.162 #53 NEW cov: 12472 ft: 15536 corp: 31/645b lim: 35 exec/s: 26 rss: 73Mb L: 14/34 MS: 1 InsertByte- 00:08:13.162 #53 DONE cov: 12472 ft: 15536 corp: 31/645b lim: 35 exec/s: 26 rss: 73Mb 00:08:13.162 Done 53 runs in 2 second(s) 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.422 04:24:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:13.422 [2024-11-17 04:24:52.133058] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:13.422 [2024-11-17 04:24:52.133140] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid150802 ] 00:08:13.682 [2024-11-17 04:24:52.335112] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.682 [2024-11-17 04:24:52.348223] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.682 [2024-11-17 04:24:52.400682] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.682 [2024-11-17 04:24:52.416966] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:13.682 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.682 INFO: Seed: 1133469384 00:08:13.682 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:13.682 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:13.682 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:13.682 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.682 #2 INITED exec/s: 0 rss: 65Mb 00:08:13.682 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.682 This may also happen if the target rejected all inputs we tried so far 00:08:13.682 [2024-11-17 04:24:52.482680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.682 [2024-11-17 04:24:52.482713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.682 [2024-11-17 04:24:52.482787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.682 [2024-11-17 04:24:52.482802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.202 NEW_FUNC[1/716]: 0x45aab8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:14.202 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:14.202 #20 NEW cov: 12238 ft: 12239 corp: 2/25b lim: 45 exec/s: 0 rss: 72Mb L: 24/24 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:08:14.202 [2024-11-17 04:24:52.813482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.202 [2024-11-17 04:24:52.813552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.202 #21 NEW cov: 12368 ft: 13694 corp: 3/35b lim: 45 exec/s: 0 rss: 72Mb L: 10/24 MS: 1 InsertRepeatedBytes- 00:08:14.202 [2024-11-17 04:24:52.863407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.202 [2024-11-17 04:24:52.863433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:14.202 [2024-11-17 04:24:52.863502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58005858 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.202 [2024-11-17 04:24:52.863516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.202 #22 NEW cov: 12374 ft: 14032 corp: 4/59b lim: 45 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\003"- 00:08:14.202 [2024-11-17 04:24:52.923604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.202 [2024-11-17 04:24:52.923630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.202 [2024-11-17 04:24:52.923687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:6e6e6e6e cdw11:6e6e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.202 [2024-11-17 04:24:52.923707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.202 #28 NEW cov: 12459 ft: 14212 corp: 5/85b lim: 45 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:14.202 [2024-11-17 04:24:52.983939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.203 [2024-11-17 04:24:52.983965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.203 [2024-11-17 04:24:52.984018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.203 [2024-11-17 04:24:52.984031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.203 [2024-11-17 04:24:52.984083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00005858 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.203 [2024-11-17 04:24:52.984096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.203 #29 NEW cov: 12459 ft: 14538 corp: 6/116b lim: 45 exec/s: 0 rss: 72Mb L: 31/31 MS: 1 CrossOver- 00:08:14.463 [2024-11-17 04:24:53.043772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a0a cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.043798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.463 #30 NEW cov: 12459 ft: 14724 corp: 7/127b lim: 45 exec/s: 0 rss: 72Mb L: 11/31 MS: 1 CrossOver- 00:08:14.463 [2024-11-17 04:24:53.083993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.084019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.463 [2024-11-17 04:24:53.084072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:5 nsid:0 cdw10:58582a58 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.084086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.463 #31 NEW cov: 12459 ft: 14805 corp: 8/152b lim: 45 exec/s: 0 rss: 72Mb L: 25/31 MS: 1 InsertByte- 00:08:14.463 [2024-11-17 04:24:53.123992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdf90afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.124019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.463 #32 NEW cov: 12459 ft: 14883 corp: 9/162b lim: 45 exec/s: 0 rss: 72Mb L: 10/31 MS: 1 ChangeBit- 00:08:14.463 [2024-11-17 04:24:53.164097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:0afd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.164123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.463 #33 NEW cov: 12459 ft: 14961 corp: 10/173b lim: 45 exec/s: 0 rss: 72Mb L: 11/31 MS: 1 CrossOver- 00:08:14.463 [2024-11-17 04:24:53.204199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.204224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.463 #34 NEW cov: 12459 ft: 15038 corp: 11/182b lim: 45 exec/s: 0 rss: 72Mb L: 9/31 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\003"- 00:08:14.463 [2024-11-17 04:24:53.244629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.244653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.463 [2024-11-17 04:24:53.244712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58582a58 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.244742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.463 [2024-11-17 04:24:53.244793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.463 [2024-11-17 04:24:53.244807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.463 #35 NEW cov: 12459 ft: 15079 corp: 12/217b lim: 45 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:14.723 [2024-11-17 04:24:53.304507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a5d cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.304533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.723 #40 NEW cov: 12459 ft: 15122 corp: 13/233b lim: 45 exec/s: 0 rss: 73Mb L: 16/35 MS: 5 
EraseBytes-ChangeBit-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:14.723 [2024-11-17 04:24:53.344743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.344768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.723 [2024-11-17 04:24:53.344835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58582a58 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.344849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.723 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:14.723 #41 NEW cov: 12482 ft: 15161 corp: 14/258b lim: 45 exec/s: 0 rss: 73Mb L: 25/35 MS: 1 ChangeBinInt- 00:08:14.723 [2024-11-17 04:24:53.384852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.384878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.723 [2024-11-17 04:24:53.384929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58005858 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.384943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.723 #42 NEW cov: 12482 ft: 15174 corp: 15/282b lim: 45 exec/s: 0 rss: 73Mb L: 24/35 MS: 1 ShuffleBytes- 00:08:14.723 [2024-11-17 04:24:53.424793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.424818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.723 #43 NEW cov: 12482 ft: 15188 corp: 16/293b lim: 45 exec/s: 0 rss: 73Mb L: 11/35 MS: 1 InsertByte- 00:08:14.723 [2024-11-17 04:24:53.465084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.465108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.723 [2024-11-17 04:24:53.465164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:580a6e03 cdw11:0a6e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.465177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.723 [2024-11-17 04:24:53.525239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.525263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.723 [2024-11-17 04:24:53.525315] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58f46e03 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.723 [2024-11-17 04:24:53.525329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.723 #45 NEW cov: 12482 ft: 15243 corp: 17/319b lim: 45 exec/s: 45 rss: 73Mb L: 26/35 MS: 2 CrossOver-CMP- DE: "\364\377\377\377"- 00:08:14.983 [2024-11-17 04:24:53.565187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a5d cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.565212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.983 #46 NEW cov: 12482 ft: 15297 corp: 18/335b lim: 45 exec/s: 46 rss: 73Mb L: 16/35 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:14.983 [2024-11-17 04:24:53.625537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.625562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.983 [2024-11-17 04:24:53.625615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:6e6e6e6e cdw11:6e6e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.625627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.983 #47 NEW cov: 12482 ft: 15307 corp: 19/361b lim: 45 exec/s: 47 rss: 73Mb L: 26/35 MS: 1 ChangeBit- 00:08:14.983 [2024-11-17 04:24:53.665482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a0a cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.665507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.983 #48 NEW cov: 12482 ft: 15330 corp: 20/377b lim: 45 exec/s: 48 rss: 73Mb L: 16/35 MS: 1 CrossOver- 00:08:14.983 [2024-11-17 04:24:53.726109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.726134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.983 [2024-11-17 04:24:53.726199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58580001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.726213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.983 [2024-11-17 04:24:53.726264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00005800 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.726278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.983 [2024-11-17 04:24:53.726328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 
cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.726345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.983 #49 NEW cov: 12482 ft: 15681 corp: 21/418b lim: 45 exec/s: 49 rss: 73Mb L: 41/41 MS: 1 CopyPart- 00:08:14.983 [2024-11-17 04:24:53.785822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.983 [2024-11-17 04:24:53.785847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.243 #50 NEW cov: 12482 ft: 15684 corp: 22/428b lim: 45 exec/s: 50 rss: 73Mb L: 10/41 MS: 1 InsertByte- 00:08:15.243 [2024-11-17 04:24:53.845995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:77000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:53.846019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.243 #51 NEW cov: 12482 ft: 15725 corp: 23/438b lim: 45 exec/s: 51 rss: 73Mb L: 10/41 MS: 1 InsertByte- 00:08:15.243 [2024-11-17 04:24:53.886042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:53.886066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.243 #52 NEW cov: 12482 ft: 15753 corp: 24/449b lim: 45 exec/s: 52 rss: 73Mb L: 11/41 MS: 1 InsertByte- 00:08:15.243 [2024-11-17 04:24:53.926220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58580afd cdw11:58fd0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:53.926244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.243 #53 NEW cov: 12482 ft: 15756 corp: 25/458b lim: 45 exec/s: 53 rss: 73Mb L: 9/41 MS: 1 CrossOver- 00:08:15.243 [2024-11-17 04:24:53.986522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:53.986547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.243 [2024-11-17 04:24:53.986599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58f46e03 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:53.986613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.243 #54 NEW cov: 12482 ft: 15771 corp: 26/484b lim: 45 exec/s: 54 rss: 73Mb L: 26/41 MS: 1 CopyPart- 00:08:15.243 [2024-11-17 04:24:54.046736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a5d cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:54.046761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.243 [2024-11-17 04:24:54.046829] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.243 [2024-11-17 04:24:54.046843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.243 #55 NEW cov: 12482 ft: 15786 corp: 27/504b lim: 45 exec/s: 55 rss: 73Mb L: 20/41 MS: 1 PersAutoDict- DE: "\364\377\377\377"- 00:08:15.503 [2024-11-17 04:24:54.086827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.503 [2024-11-17 04:24:54.086853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.503 [2024-11-17 04:24:54.086908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58582a58 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.503 [2024-11-17 04:24:54.086921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.503 #56 NEW cov: 12482 ft: 15792 corp: 28/530b lim: 45 exec/s: 56 rss: 73Mb L: 26/41 MS: 1 InsertByte- 00:08:15.503 [2024-11-17 04:24:54.127024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.503 [2024-11-17 04:24:54.127049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.503 [2024-11-17 04:24:54.127101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:03000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.503 [2024-11-17 04:24:54.127114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.503 #57 NEW cov: 12482 ft: 15859 corp: 29/554b lim: 45 exec/s: 57 rss: 74Mb L: 24/41 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\003"- 00:08:15.503 [2024-11-17 04:24:54.187022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0afd cdw11:fdfd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.187048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.504 #58 NEW cov: 12482 ft: 15862 corp: 30/569b lim: 45 exec/s: 58 rss: 74Mb L: 15/41 MS: 1 CopyPart- 00:08:15.504 [2024-11-17 04:24:54.227185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a5d cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.227209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.504 [2024-11-17 04:24:54.227276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.227290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.504 #59 NEW cov: 12482 ft: 15873 corp: 31/594b lim: 45 exec/s: 59 rss: 74Mb L: 25/41 MS: 1 CopyPart- 00:08:15.504 [2024-11-17 
04:24:54.267506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a5d cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.267530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.504 [2024-11-17 04:24:54.267581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.267594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.504 [2024-11-17 04:24:54.267644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fdff5dfd cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.267658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.504 #60 NEW cov: 12482 ft: 15903 corp: 32/621b lim: 45 exec/s: 60 rss: 74Mb L: 27/41 MS: 1 CrossOver- 00:08:15.504 [2024-11-17 04:24:54.327845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.327870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.504 [2024-11-17 04:24:54.327922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.327939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.504 [2024-11-17 04:24:54.327988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.328001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.504 [2024-11-17 04:24:54.328049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:58585858 cdw11:580a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.504 [2024-11-17 04:24:54.328062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.764 #61 NEW cov: 12482 ft: 15915 corp: 33/663b lim: 45 exec/s: 61 rss: 74Mb L: 42/42 MS: 1 CopyPart- 00:08:15.764 [2024-11-17 04:24:54.367611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:58585858 cdw11:58580002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.764 [2024-11-17 04:24:54.367636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.764 [2024-11-17 04:24:54.367687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.764 [2024-11-17 04:24:54.367705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.764 #62 NEW cov: 
12482 ft: 15924 corp: 34/684b lim: 45 exec/s: 62 rss: 74Mb L: 21/42 MS: 1 EraseBytes- 00:08:15.764 [2024-11-17 04:24:54.427609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.764 [2024-11-17 04:24:54.427635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.764 #63 NEW cov: 12482 ft: 15938 corp: 35/697b lim: 45 exec/s: 63 rss: 74Mb L: 13/42 MS: 1 CopyPart- 00:08:15.764 [2024-11-17 04:24:54.467737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fdfd0a5d cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.764 [2024-11-17 04:24:54.467762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.764 #64 pulse cov: 12482 ft: 15951 corp: 35/697b lim: 45 exec/s: 32 rss: 74Mb 00:08:15.764 #64 NEW cov: 12482 ft: 15951 corp: 36/713b lim: 45 exec/s: 32 rss: 74Mb L: 16/42 MS: 1 EraseBytes- 00:08:15.764 #64 DONE cov: 12482 ft: 15951 corp: 36/713b lim: 45 exec/s: 32 rss: 74Mb 00:08:15.764 ###### Recommended dictionary. ###### 00:08:15.764 "\000\000\000\000\000\000\000\003" # Uses: 2 00:08:15.764 "\364\377\377\377" # Uses: 2 00:08:15.764 ###### End of recommended dictionary. ###### 00:08:15.764 Done 64 runs in 2 second(s) 00:08:16.024 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:16.024 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:16.024 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.024 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:16.024 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:16.024 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo 
leak:nvmf_ctrlr_create 00:08:16.025 04:24:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:16.025 [2024-11-17 04:24:54.654776] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:16.025 [2024-11-17 04:24:54.654844] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151156 ] 00:08:16.025 [2024-11-17 04:24:54.853060] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.285 [2024-11-17 04:24:54.866070] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.285 [2024-11-17 04:24:54.918564] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.285 [2024-11-17 04:24:54.934907] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:16.285 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.285 INFO: Seed: 3650472313 00:08:16.285 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:16.285 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:16.285 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:16.285 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.285 #2 INITED exec/s: 0 rss: 65Mb 00:08:16.285 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:16.285 This may also happen if the target rejected all inputs we tried so far 00:08:16.285 [2024-11-17 04:24:54.990570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:16.285 [2024-11-17 04:24:54.990598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.285 [2024-11-17 04:24:54.990650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.285 [2024-11-17 04:24:54.990664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.285 [2024-11-17 04:24:54.990719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.285 [2024-11-17 04:24:54.990732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.285 [2024-11-17 04:24:54.990782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.285 [2024-11-17 04:24:54.990796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.545 NEW_FUNC[1/714]: 0x45d2c8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:16.545 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.545 #3 NEW cov: 12173 ft: 12152 corp: 2/9b lim: 10 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:16.545 [2024-11-17 04:24:55.321181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e2c cdw11:00000000 00:08:16.545 [2024-11-17 04:24:55.321224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.545 #5 NEW cov: 12286 ft: 12964 corp: 3/11b lim: 10 exec/s: 0 rss: 72Mb L: 2/8 MS: 2 ChangeByte-InsertByte- 00:08:16.545 [2024-11-17 04:24:55.361071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e7e cdw11:00000000 00:08:16.545 [2024-11-17 04:24:55.361096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.805 #7 NEW cov: 12292 ft: 13281 corp: 4/13b lim: 10 exec/s: 0 rss: 72Mb L: 2/8 MS: 2 EraseBytes-CopyPart- 00:08:16.805 [2024-11-17 04:24:55.421253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e7e cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.421278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.805 #9 NEW cov: 12377 ft: 13609 corp: 5/15b lim: 10 exec/s: 0 rss: 72Mb L: 2/8 MS: 2 EraseBytes-CopyPart- 00:08:16.805 [2024-11-17 04:24:55.461387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e7e cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.461411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:16.805 #10 NEW cov: 12377 ft: 13726 corp: 6/18b lim: 10 exec/s: 0 rss: 72Mb L: 3/8 MS: 1 InsertByte- 00:08:16.805 [2024-11-17 04:24:55.521753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.521778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.805 [2024-11-17 04:24:55.521848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.521862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.805 [2024-11-17 04:24:55.521913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.521926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.805 #11 NEW cov: 12377 ft: 13947 corp: 7/25b lim: 10 exec/s: 0 rss: 72Mb L: 7/8 MS: 1 InsertRepeatedBytes- 00:08:16.805 [2024-11-17 04:24:55.581730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.581755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.805 #12 NEW cov: 12377 ft: 13995 corp: 8/27b lim: 10 exec/s: 0 rss: 72Mb L: 2/8 MS: 1 InsertByte- 00:08:16.805 [2024-11-17 04:24:55.621940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e2c cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.621965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.805 [2024-11-17 04:24:55.622018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007e7e cdw11:00000000 00:08:16.805 [2024-11-17 04:24:55.622032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.065 #13 NEW cov: 12377 ft: 14213 corp: 9/31b lim: 10 exec/s: 0 rss: 72Mb L: 4/8 MS: 1 CrossOver- 00:08:17.065 [2024-11-17 04:24:55.661953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003ae6 cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.661982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.065 #14 NEW cov: 12377 ft: 14251 corp: 10/33b lim: 10 exec/s: 0 rss: 72Mb L: 2/8 MS: 1 ChangeByte- 00:08:17.065 [2024-11-17 04:24:55.722415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.722440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.065 [2024-11-17 04:24:55.722492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.722506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:17.065 [2024-11-17 04:24:55.722556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.722568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.065 [2024-11-17 04:24:55.722621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.722633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.065 #15 NEW cov: 12377 ft: 14292 corp: 11/41b lim: 10 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:17.065 [2024-11-17 04:24:55.782210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e0a cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.782234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.065 #16 NEW cov: 12377 ft: 14336 corp: 12/43b lim: 10 exec/s: 0 rss: 72Mb L: 2/8 MS: 1 CrossOver- 00:08:17.065 [2024-11-17 04:24:55.822590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.822615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.065 [2024-11-17 04:24:55.822685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.822705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.065 [2024-11-17 04:24:55.822758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.822772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.065 #17 NEW cov: 12377 ft: 14393 corp: 13/50b lim: 10 exec/s: 0 rss: 73Mb L: 7/8 MS: 1 ChangeBit- 00:08:17.065 [2024-11-17 04:24:55.882899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000cff cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.882924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.065 [2024-11-17 04:24:55.882995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.065 [2024-11-17 04:24:55.883009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.065 [2024-11-17 04:24:55.883063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.066 [2024-11-17 04:24:55.883076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.066 [2024-11-17 04:24:55.883130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.066 [2024-11-17 04:24:55.883146] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.326 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:17.326 #18 NEW cov: 12400 ft: 14461 corp: 14/58b lim: 10 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:17.326 [2024-11-17 04:24:55.943039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:55.943065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:55.943135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:55.943151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:55.943205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:55.943220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:55.943274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:55.943288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.326 #19 NEW cov: 12400 ft: 14485 corp: 15/66b lim: 10 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 CopyPart- 00:08:17.326 [2024-11-17 04:24:55.982946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003b7f cdw11:00000000 00:08:17.326 [2024-11-17 04:24:55.982971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:55.983040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:08:17.326 [2024-11-17 04:24:55.983054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.326 #21 NEW cov: 12400 ft: 14512 corp: 16/71b lim: 10 exec/s: 21 rss: 73Mb L: 5/8 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:17.326 [2024-11-17 04:24:56.023270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.023295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:56.023364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.023378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:56.023431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffef cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.023445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:56.023496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.023509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.326 #22 NEW cov: 12400 ft: 14526 corp: 17/79b lim: 10 exec/s: 22 rss: 73Mb L: 8/8 MS: 1 ChangeBit- 00:08:17.326 [2024-11-17 04:24:56.063252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.063278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:56.063335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.063349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.326 [2024-11-17 04:24:56.063399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.063412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.326 #23 NEW cov: 12400 ft: 14547 corp: 18/86b lim: 10 exec/s: 23 rss: 73Mb L: 7/8 MS: 1 ChangeBinInt- 00:08:17.326 [2024-11-17 04:24:56.123241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000397e cdw11:00000000 00:08:17.326 [2024-11-17 04:24:56.123267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.326 #24 NEW cov: 12400 ft: 14570 corp: 19/88b lim: 10 exec/s: 24 rss: 73Mb L: 2/8 MS: 1 ChangeByte- 00:08:17.605 [2024-11-17 04:24:56.163673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.163702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.163756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.163770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.163821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.163851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.163902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.163915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.605 #25 NEW cov: 12400 ft: 14611 corp: 20/96b lim: 10 exec/s: 25 rss: 73Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:17.605 [2024-11-17 04:24:56.223605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e7e cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.223630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.223755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b5b5 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.223771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.605 #26 NEW cov: 12400 ft: 14623 corp: 21/101b lim: 10 exec/s: 26 rss: 73Mb L: 5/8 MS: 1 InsertRepeatedBytes- 00:08:17.605 [2024-11-17 04:24:56.263588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e2c cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.263613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.605 #27 NEW cov: 12400 ft: 14648 corp: 22/103b lim: 10 exec/s: 27 rss: 73Mb L: 2/8 MS: 1 CopyPart- 00:08:17.605 [2024-11-17 04:24:56.303971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.303997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.304065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.304083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.304136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.304150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.605 #28 NEW cov: 12400 ft: 14670 corp: 23/110b lim: 10 exec/s: 28 rss: 73Mb L: 7/8 MS: 1 ShuffleBytes- 00:08:17.605 [2024-11-17 04:24:56.364329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.364355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.364408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.364422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.364474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.364487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.364536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.364549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:08:17.605 #29 NEW cov: 12400 ft: 14697 corp: 24/119b lim: 10 exec/s: 29 rss: 73Mb L: 9/9 MS: 1 CopyPart- 00:08:17.605 [2024-11-17 04:24:56.404107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003b86 cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.404133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.605 [2024-11-17 04:24:56.404203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:08:17.605 [2024-11-17 04:24:56.404218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.866 #30 NEW cov: 12400 ft: 14748 corp: 25/124b lim: 10 exec/s: 30 rss: 73Mb L: 5/9 MS: 1 ChangeBinInt- 00:08:17.866 [2024-11-17 04:24:56.464307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.464332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.866 [2024-11-17 04:24:56.464384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.464398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.866 #31 NEW cov: 12400 ft: 14754 corp: 26/129b lim: 10 exec/s: 31 rss: 73Mb L: 5/9 MS: 1 EraseBytes- 00:08:17.866 [2024-11-17 04:24:56.504327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003a31 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.504351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.866 #32 NEW cov: 12400 ft: 14767 corp: 27/131b lim: 10 exec/s: 32 rss: 73Mb L: 2/9 MS: 1 ChangeByte- 00:08:17.866 [2024-11-17 04:24:56.544816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.544841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.866 [2024-11-17 04:24:56.544892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.544908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.866 [2024-11-17 04:24:56.544956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.544969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.866 [2024-11-17 04:24:56.545018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.545031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.866 #33 NEW cov: 12400 ft: 14784 corp: 28/140b lim: 10 exec/s: 33 rss: 73Mb L: 9/9 
MS: 1 ShuffleBytes- 00:08:17.866 [2024-11-17 04:24:56.604869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000760a cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.604894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.866 #34 NEW cov: 12400 ft: 14817 corp: 29/142b lim: 10 exec/s: 34 rss: 73Mb L: 2/9 MS: 1 ChangeBit- 00:08:17.866 [2024-11-17 04:24:56.664808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:17.866 [2024-11-17 04:24:56.664833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.866 #36 NEW cov: 12400 ft: 14834 corp: 30/145b lim: 10 exec/s: 36 rss: 74Mb L: 3/9 MS: 2 EraseBytes-CrossOver- 00:08:18.127 [2024-11-17 04:24:56.705029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e7e cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.705053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.705104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b5bd cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.705118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.127 #42 NEW cov: 12400 ft: 14845 corp: 31/150b lim: 10 exec/s: 42 rss: 74Mb L: 5/9 MS: 1 ChangeBit- 00:08:18.127 [2024-11-17 04:24:56.765286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e11 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.765311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.765376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.765390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.765440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.765452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.127 #43 NEW cov: 12400 ft: 14858 corp: 32/157b lim: 10 exec/s: 43 rss: 74Mb L: 7/9 MS: 1 ChangeByte- 00:08:18.127 [2024-11-17 04:24:56.805393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e2c cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.805418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.805468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000cff cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.805481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.805533] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.805546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.127 #44 NEW cov: 12400 ft: 14867 corp: 33/163b lim: 10 exec/s: 44 rss: 74Mb L: 6/9 MS: 1 CrossOver- 00:08:18.127 [2024-11-17 04:24:56.865556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e01 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.865581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.865632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.865645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.865698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.865711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.127 #45 NEW cov: 12400 ft: 14872 corp: 34/170b lim: 10 exec/s: 45 rss: 74Mb L: 7/9 MS: 1 ChangeBit- 00:08:18.127 [2024-11-17 04:24:56.925716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007e00 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.925741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.925791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.925804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.127 [2024-11-17 04:24:56.925853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:08:18.127 [2024-11-17 04:24:56.925866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.127 #46 NEW cov: 12400 ft: 14886 corp: 35/177b lim: 10 exec/s: 46 rss: 74Mb L: 7/9 MS: 1 ChangeBit- 00:08:18.387 [2024-11-17 04:24:56.965700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fe7e cdw11:00000000 00:08:18.387 [2024-11-17 04:24:56.965726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.387 [2024-11-17 04:24:56.965776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b5b5 cdw11:00000000 00:08:18.387 [2024-11-17 04:24:56.965790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.387 #47 NEW cov: 12400 ft: 14887 corp: 36/182b lim: 10 exec/s: 23 rss: 74Mb L: 5/9 MS: 1 ChangeBit- 00:08:18.387 #47 DONE cov: 12400 ft: 14887 corp: 36/182b lim: 10 exec/s: 23 rss: 74Mb 00:08:18.387 Done 47 runs in 2 second(s) 00:08:18.387 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.387 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.387 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.387 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.388 04:24:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:18.388 [2024-11-17 04:24:57.130582] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:18.388 [2024-11-17 04:24:57.130652] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151616 ] 00:08:18.647 [2024-11-17 04:24:57.331736] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.647 [2024-11-17 04:24:57.344845] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.647 [2024-11-17 04:24:57.397401] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.647 [2024-11-17 04:24:57.413746] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:18.647 INFO: Running with entropic power schedule (0xFF, 100). 
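Note: the nvmf/run.sh xtrace above repeats the same per-instance setup for each fuzzer (ports 4406, 4407, 4408 in this excerpt). A condensed bash reconstruction of that start_llvm_fuzz body follows for reference; the $rootdir placeholder, the output redirections, and the exact form of the port assignment are assumptions here — the trace only shows the resulting commands and values.

start_llvm_fuzz() {
	local fuzzer_type=$1 timen=$2 core=$3
	local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
	local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
	local suppress_file=/var/tmp/suppress_nvmf_fuzz
	# LeakSanitizer options as shown in the trace (the real script makes these visible to the fuzzer process).
	local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0

	# Each instance gets its own NVMe/TCP listener port: "44" + zero-padded fuzzer number.
	local port="44$(printf %02d "$fuzzer_type")"
	mkdir -p "$corpus_dir"

	local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

	# Rewrite the shared target config template so this instance listens on its own port.
	sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
		"$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

	# Suppress the two known benign leaks for LeakSanitizer.
	echo "leak:spdk_nvmf_qpair_disconnect" > "$suppress_file"
	echo "leak:nvmf_ctrlr_create" >> "$suppress_file"

	# -t: run time in seconds, -D: corpus directory, -Z: fuzzer number, -F: target TRID.
	"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
		-P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" \
		-t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"

	rm -rf "$nvmf_cfg" "$suppress_file"
}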
00:08:18.647 INFO: Seed: 1833527761 00:08:18.647 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:18.647 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:18.647 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:18.647 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.647 #2 INITED exec/s: 0 rss: 65Mb 00:08:18.647 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.647 This may also happen if the target rejected all inputs we tried so far 00:08:18.648 [2024-11-17 04:24:57.458481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002daa cdw11:00000000 00:08:18.648 [2024-11-17 04:24:57.458515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.167 NEW_FUNC[1/714]: 0x45dcc8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:19.167 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.167 #5 NEW cov: 12173 ft: 12172 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 3 ShuffleBytes-ChangeByte-InsertByte- 00:08:19.167 [2024-11-17 04:24:57.829494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d26 cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.829534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.167 [2024-11-17 04:24:57.829580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.829599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.167 [2024-11-17 04:24:57.829627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.829642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.167 #6 NEW cov: 12286 ft: 13023 corp: 3/10b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:08:19.167 [2024-11-17 04:24:57.919527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002daa cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.919558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.167 #7 NEW cov: 12292 ft: 13341 corp: 4/13b lim: 10 exec/s: 0 rss: 72Mb L: 3/7 MS: 1 InsertByte- 00:08:19.167 [2024-11-17 04:24:57.979733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d26 cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.979762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.167 [2024-11-17 04:24:57.979807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.979822] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.167 [2024-11-17 04:24:57.979849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.167 [2024-11-17 04:24:57.979864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.428 #8 NEW cov: 12377 ft: 13680 corp: 5/20b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 ShuffleBytes- 00:08:19.428 [2024-11-17 04:24:58.069909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002c2d cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.069940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.428 #9 NEW cov: 12377 ft: 13780 corp: 6/23b lim: 10 exec/s: 0 rss: 72Mb L: 3/7 MS: 1 InsertByte- 00:08:19.428 [2024-11-17 04:24:58.120134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d0a cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.120163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.428 [2024-11-17 04:24:58.120208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.120223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.428 [2024-11-17 04:24:58.120250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.120265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.428 #10 NEW cov: 12377 ft: 13842 corp: 7/30b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 CrossOver- 00:08:19.428 [2024-11-17 04:24:58.210405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d26 cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.210434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.428 [2024-11-17 04:24:58.210478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.210494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.428 [2024-11-17 04:24:58.210526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.210541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.428 [2024-11-17 04:24:58.210567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000aa2d cdw11:00000000 00:08:19.428 [2024-11-17 04:24:58.210582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.428 #11 NEW cov: 12377 ft: 14169 corp: 8/39b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CrossOver- 00:08:19.689 
[2024-11-17 04:24:58.270451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002daa cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.270482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.689 #12 NEW cov: 12377 ft: 14241 corp: 9/42b lim: 10 exec/s: 0 rss: 72Mb L: 3/9 MS: 1 ChangeBit- 00:08:19.689 [2024-11-17 04:24:58.360807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d0a cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.360837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.689 [2024-11-17 04:24:58.360883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003026 cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.360898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.689 [2024-11-17 04:24:58.360925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.360941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.689 [2024-11-17 04:24:58.360968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000026aa cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.360983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.689 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:19.689 #13 NEW cov: 12400 ft: 14284 corp: 10/50b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 1 InsertByte- 00:08:19.689 [2024-11-17 04:24:58.451009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d0a cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.451038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.689 [2024-11-17 04:24:58.451083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003001 cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.451099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.689 [2024-11-17 04:24:58.451125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.451140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.689 [2024-11-17 04:24:58.451166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000004aa cdw11:00000000 00:08:19.689 [2024-11-17 04:24:58.451181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.950 #14 NEW cov: 12400 ft: 14320 corp: 11/58b lim: 10 exec/s: 14 rss: 73Mb L: 8/9 MS: 1 CMP- DE: "\001\000\000\004"- 00:08:19.950 [2024-11-17 04:24:58.541109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002daa cdw11:00000000 00:08:19.950 [2024-11-17 04:24:58.541139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.950 #15 NEW cov: 12400 ft: 14335 corp: 12/60b lim: 10 exec/s: 15 rss: 73Mb L: 2/9 MS: 1 ShuffleBytes- 00:08:19.950 [2024-11-17 04:24:58.591269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d72d cdw11:00000000 00:08:19.950 [2024-11-17 04:24:58.591299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.950 #16 NEW cov: 12400 ft: 14460 corp: 13/63b lim: 10 exec/s: 16 rss: 73Mb L: 3/9 MS: 1 ChangeByte- 00:08:19.950 [2024-11-17 04:24:58.681602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d0a cdw11:00000000 00:08:19.950 [2024-11-17 04:24:58.681633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.950 [2024-11-17 04:24:58.681664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.950 [2024-11-17 04:24:58.681679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.950 [2024-11-17 04:24:58.681714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:19.950 [2024-11-17 04:24:58.681729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.950 #17 NEW cov: 12400 ft: 14493 corp: 14/70b lim: 10 exec/s: 17 rss: 73Mb L: 7/9 MS: 1 ShuffleBytes- 00:08:19.950 [2024-11-17 04:24:58.741673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002daa cdw11:00000000 00:08:19.950 [2024-11-17 04:24:58.741709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.210 #18 NEW cov: 12400 ft: 14524 corp: 15/73b lim: 10 exec/s: 18 rss: 73Mb L: 3/9 MS: 1 CopyPart- 00:08:20.210 [2024-11-17 04:24:58.831924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000caaa cdw11:00000000 00:08:20.210 [2024-11-17 04:24:58.831953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.210 #19 NEW cov: 12400 ft: 14594 corp: 16/75b lim: 10 exec/s: 19 rss: 73Mb L: 2/9 MS: 1 ChangeByte- 00:08:20.210 [2024-11-17 04:24:58.922155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000daaa cdw11:00000000 00:08:20.210 [2024-11-17 04:24:58.922185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.210 #20 NEW cov: 12400 ft: 14639 corp: 17/77b lim: 10 exec/s: 20 rss: 73Mb L: 2/9 MS: 1 ChangeBit- 00:08:20.210 [2024-11-17 04:24:59.012351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000040aa cdw11:00000000 00:08:20.210 [2024-11-17 04:24:59.012379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:20.476 #21 NEW cov: 12400 ft: 14649 corp: 18/79b lim: 10 exec/s: 21 rss: 73Mb L: 2/9 MS: 1 ChangeByte- 00:08:20.476 [2024-11-17 04:24:59.062621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2d cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.062661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.062714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.062730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.062757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.062772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.477 #22 NEW cov: 12400 ft: 14709 corp: 19/86b lim: 10 exec/s: 22 rss: 73Mb L: 7/9 MS: 1 ShuffleBytes- 00:08:20.477 [2024-11-17 04:24:59.112865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d26 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.112894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.112939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002626 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.112954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.112981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002626 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.112996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.113022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000aa26 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.113037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.113062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00002626 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.113077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.477 #23 NEW cov: 12400 ft: 14789 corp: 20/96b lim: 10 exec/s: 23 rss: 73Mb L: 10/10 MS: 1 CopyPart- 00:08:20.477 [2024-11-17 04:24:59.172781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000daa2 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.172811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.477 #24 NEW cov: 12400 ft: 14811 corp: 21/98b lim: 10 exec/s: 24 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:08:20.477 [2024-11-17 04:24:59.263195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) 
qid:0 cid:4 nsid:0 cdw10:00002d0a cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.263225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.263269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003001 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.263284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.263310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.263325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.477 [2024-11-17 04:24:59.263351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000004aa cdw11:00000000 00:08:20.477 [2024-11-17 04:24:59.263366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.737 #25 NEW cov: 12400 ft: 14825 corp: 22/106b lim: 10 exec/s: 25 rss: 73Mb L: 8/10 MS: 1 CrossOver- 00:08:20.737 [2024-11-17 04:24:59.353250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d728 cdw11:00000000 00:08:20.737 [2024-11-17 04:24:59.353281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.737 #26 NEW cov: 12400 ft: 14840 corp: 23/109b lim: 10 exec/s: 26 rss: 74Mb L: 3/10 MS: 1 ChangeBinInt- 00:08:20.737 [2024-11-17 04:24:59.443584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:20.737 [2024-11-17 04:24:59.443618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.737 [2024-11-17 04:24:59.443663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:20.737 [2024-11-17 04:24:59.443678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.737 [2024-11-17 04:24:59.443712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002c2d cdw11:00000000 00:08:20.737 [2024-11-17 04:24:59.443728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.737 #27 NEW cov: 12400 ft: 14880 corp: 24/116b lim: 10 exec/s: 13 rss: 74Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:20.737 #27 DONE cov: 12400 ft: 14880 corp: 24/116b lim: 10 exec/s: 13 rss: 74Mb 00:08:20.737 ###### Recommended dictionary. ###### 00:08:20.737 "\001\000\000\004" # Uses: 0 00:08:20.737 ###### End of recommended dictionary. 
###### 00:08:20.737 Done 27 runs in 2 second(s) 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:20.998 04:24:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:20.998 [2024-11-17 04:24:59.628140] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:20.998 [2024-11-17 04:24:59.628207] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152121 ] 00:08:21.258 [2024-11-17 04:24:59.842150] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.258 [2024-11-17 04:24:59.854912] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.258 [2024-11-17 04:24:59.907243] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.258 [2024-11-17 04:24:59.923565] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:21.258 INFO: Running with entropic power schedule (0xFF, 100). 
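Note: the ../common.sh@72/@73 entries in the trace above are the driver loop that bumps the fuzzer index and launches the next instance. A minimal sketch of that loop is given below; the starting index and fuzz_num are not visible in this excerpt, so the values here are placeholders rather than the suite's real configuration.

timen=1      # run time per fuzzer, matching the -t 1 seen in each invocation above
core=0x1     # core mask, matching -m 0x1
fuzz_num=9   # placeholder: this excerpt only shows fuzzers 5 through 8 running

for ((i = 0; i < fuzz_num; i++)); do
	start_llvm_fuzz "$i" "$timen" "$core"
done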
00:08:21.258 INFO: Seed: 49555145 00:08:21.258 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:21.258 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:21.258 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:21.258 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.259 [2024-11-17 04:24:59.968406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.259 [2024-11-17 04:24:59.968442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.259 #2 INITED cov: 12201 ft: 12200 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:21.259 [2024-11-17 04:25:00.018416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.259 [2024-11-17 04:25:00.018449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.519 #3 NEW cov: 12314 ft: 12928 corp: 2/2b lim: 5 exec/s: 0 rss: 70Mb L: 1/1 MS: 1 ShuffleBytes- 00:08:21.519 [2024-11-17 04:25:00.108670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-11-17 04:25:00.108716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.519 #4 NEW cov: 12320 ft: 13136 corp: 3/3b lim: 5 exec/s: 0 rss: 70Mb L: 1/1 MS: 1 ChangeByte- 00:08:21.519 [2024-11-17 04:25:00.158829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-11-17 04:25:00.158860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.519 [2024-11-17 04:25:00.158893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-11-17 04:25:00.158910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.519 #5 NEW cov: 12405 ft: 14028 corp: 4/5b lim: 5 exec/s: 0 rss: 70Mb L: 2/2 MS: 1 InsertByte- 00:08:21.519 [2024-11-17 04:25:00.249060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-11-17 04:25:00.249092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.519 [2024-11-17 04:25:00.249126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-11-17 04:25:00.249142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.519 #6 NEW cov: 12405 ft: 14120 corp: 5/7b lim: 5 exec/s: 0 rss: 70Mb L: 2/2 MS: 1 ChangeByte- 00:08:21.519 
[2024-11-17 04:25:00.339202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-11-17 04:25:00.339233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.778 #7 NEW cov: 12405 ft: 14169 corp: 6/8b lim: 5 exec/s: 0 rss: 70Mb L: 1/2 MS: 1 EraseBytes- 00:08:21.778 [2024-11-17 04:25:00.399443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.399479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.778 [2024-11-17 04:25:00.399513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.399529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.778 #8 NEW cov: 12405 ft: 14292 corp: 7/10b lim: 5 exec/s: 0 rss: 70Mb L: 2/2 MS: 1 ShuffleBytes- 00:08:21.778 [2024-11-17 04:25:00.489699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.489743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.778 [2024-11-17 04:25:00.489777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.489793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.778 [2024-11-17 04:25:00.489821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.489837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.778 #9 NEW cov: 12405 ft: 14542 corp: 8/13b lim: 5 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 CrossOver- 00:08:21.778 [2024-11-17 04:25:00.549833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.549864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.778 [2024-11-17 04:25:00.549897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.549913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.778 #10 NEW cov: 12405 ft: 14588 corp: 9/15b lim: 5 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 ChangeBit- 00:08:21.778 [2024-11-17 04:25:00.600033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.600063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.778 [2024-11-17 04:25:00.600097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.600113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.778 [2024-11-17 04:25:00.600142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.778 [2024-11-17 04:25:00.600157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.038 #11 NEW cov: 12405 ft: 14668 corp: 10/18b lim: 5 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 InsertByte- 00:08:22.038 [2024-11-17 04:25:00.660095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.660124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.038 [2024-11-17 04:25:00.660176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.660192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.038 #12 NEW cov: 12405 ft: 14699 corp: 11/20b lim: 5 exec/s: 0 rss: 70Mb L: 2/3 MS: 1 ChangeBit- 00:08:22.038 [2024-11-17 04:25:00.720353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.720384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.038 [2024-11-17 04:25:00.720417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.720433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.038 [2024-11-17 04:25:00.720462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.720478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.038 #13 NEW cov: 12405 ft: 14722 corp: 12/23b lim: 5 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 ChangeBit- 00:08:22.038 [2024-11-17 04:25:00.810668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.810707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.038 [2024-11-17 04:25:00.810742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.810758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.038 [2024-11-17 04:25:00.810788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.810803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.038 [2024-11-17 04:25:00.810832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-11-17 04:25:00.810848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.558 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:22.558 #14 NEW cov: 12428 ft: 15047 corp: 13/27b lim: 5 exec/s: 14 rss: 72Mb L: 4/4 MS: 1 CrossOver- 00:08:22.558 [2024-11-17 04:25:01.182203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.182237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.558 [2024-11-17 04:25:01.182306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.182320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.558 #15 NEW cov: 12428 ft: 15185 corp: 14/29b lim: 5 exec/s: 15 rss: 72Mb L: 2/4 MS: 1 EraseBytes- 00:08:22.558 [2024-11-17 04:25:01.242279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.242308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.558 [2024-11-17 04:25:01.242365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.242378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.558 #16 NEW cov: 12428 ft: 15239 corp: 15/31b lim: 5 exec/s: 16 rss: 72Mb L: 2/4 MS: 1 ChangeBit- 00:08:22.558 [2024-11-17 04:25:01.302465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.302489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.558 [2024-11-17 
04:25:01.302560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.302574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.558 #17 NEW cov: 12428 ft: 15311 corp: 16/33b lim: 5 exec/s: 17 rss: 72Mb L: 2/4 MS: 1 ChangeBinInt- 00:08:22.558 [2024-11-17 04:25:01.362592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.362618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.558 [2024-11-17 04:25:01.362673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.558 [2024-11-17 04:25:01.362686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.818 #18 NEW cov: 12428 ft: 15392 corp: 17/35b lim: 5 exec/s: 18 rss: 72Mb L: 2/4 MS: 1 ChangeBit- 00:08:22.818 [2024-11-17 04:25:01.423271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.818 [2024-11-17 04:25:01.423297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.818 [2024-11-17 04:25:01.423353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.818 [2024-11-17 04:25:01.423367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.818 [2024-11-17 04:25:01.423421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.423434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.819 [2024-11-17 04:25:01.423486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.423499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.819 [2024-11-17 04:25:01.423553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.423567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.819 #19 NEW cov: 12428 ft: 15464 corp: 18/40b lim: 5 exec/s: 19 rss: 72Mb L: 5/5 MS: 1 InsertByte- 00:08:22.819 [2024-11-17 04:25:01.462909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 
[2024-11-17 04:25:01.462933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.819 [2024-11-17 04:25:01.463005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.463019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.819 #20 NEW cov: 12428 ft: 15492 corp: 19/42b lim: 5 exec/s: 20 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:22.819 [2024-11-17 04:25:01.523192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.523217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.819 [2024-11-17 04:25:01.523288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.523302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.819 [2024-11-17 04:25:01.523355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.523368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.819 #21 NEW cov: 12428 ft: 15512 corp: 20/45b lim: 5 exec/s: 21 rss: 72Mb L: 3/5 MS: 1 CopyPart- 00:08:22.819 [2024-11-17 04:25:01.583072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.583097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.819 #22 NEW cov: 12428 ft: 15566 corp: 21/46b lim: 5 exec/s: 22 rss: 72Mb L: 1/5 MS: 1 CrossOver- 00:08:22.819 [2024-11-17 04:25:01.623331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.623355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.819 [2024-11-17 04:25:01.623427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.819 [2024-11-17 04:25:01.623440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.819 #23 NEW cov: 12428 ft: 15629 corp: 22/48b lim: 5 exec/s: 23 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:23.080 [2024-11-17 04:25:01.663574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.663599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.663668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.663682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.663736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.663752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.080 #24 NEW cov: 12428 ft: 15640 corp: 23/51b lim: 5 exec/s: 24 rss: 72Mb L: 3/5 MS: 1 ChangeByte- 00:08:23.080 [2024-11-17 04:25:01.703579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.703604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.703659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.703673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.080 #25 NEW cov: 12428 ft: 15670 corp: 24/53b lim: 5 exec/s: 25 rss: 72Mb L: 2/5 MS: 1 CopyPart- 00:08:23.080 [2024-11-17 04:25:01.743699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.743724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.743796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.743810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.080 #26 NEW cov: 12428 ft: 15679 corp: 25/55b lim: 5 exec/s: 26 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:23.080 [2024-11-17 04:25:01.804146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.804170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.804227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.804242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.804295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 
cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.804308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.804362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.804375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.080 #27 NEW cov: 12428 ft: 15732 corp: 26/59b lim: 5 exec/s: 27 rss: 72Mb L: 4/5 MS: 1 CrossOver- 00:08:23.080 [2024-11-17 04:25:01.864000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.864024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.080 [2024-11-17 04:25:01.864077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.080 [2024-11-17 04:25:01.864091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.080 #28 NEW cov: 12428 ft: 15741 corp: 27/61b lim: 5 exec/s: 28 rss: 73Mb L: 2/5 MS: 1 ChangeByte- 00:08:23.340 [2024-11-17 04:25:01.924017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.340 [2024-11-17 04:25:01.924042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.340 #29 NEW cov: 12428 ft: 15764 corp: 28/62b lim: 5 exec/s: 14 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:08:23.340 #29 DONE cov: 12428 ft: 15764 corp: 28/62b lim: 5 exec/s: 14 rss: 73Mb 00:08:23.340 Done 29 runs in 2 second(s) 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # 
port=4409 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.340 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.341 04:25:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:23.341 [2024-11-17 04:25:02.107878] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:23.341 [2024-11-17 04:25:02.107950] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152437 ] 00:08:23.601 [2024-11-17 04:25:02.308427] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.601 [2024-11-17 04:25:02.320907] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.601 [2024-11-17 04:25:02.373348] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.601 [2024-11-17 04:25:02.389678] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:23.601 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:23.601 INFO: Seed: 2515551976 00:08:23.860 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:23.860 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:23.860 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:23.860 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.860 [2024-11-17 04:25:02.465943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.465981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.860 #2 INITED cov: 12201 ft: 12192 corp: 1/1b exec/s: 0 rss: 71Mb 00:08:23.860 [2024-11-17 04:25:02.516348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.516378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.860 [2024-11-17 04:25:02.516515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.516532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.860 #3 NEW cov: 12314 ft: 13636 corp: 2/3b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:08:23.860 [2024-11-17 04:25:02.586860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.586888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.860 [2024-11-17 04:25:02.587018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.587038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.860 [2024-11-17 04:25:02.587167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.587184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.860 #4 NEW cov: 12320 ft: 14001 corp: 3/6b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 CrossOver- 00:08:23.860 [2024-11-17 04:25:02.657039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.657068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.860 [2024-11-17 04:25:02.657195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:23.860 [2024-11-17 04:25:02.657213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.860 [2024-11-17 04:25:02.657337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.860 [2024-11-17 04:25:02.657354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.120 #5 NEW cov: 12405 ft: 14203 corp: 4/9b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 CopyPart- 00:08:24.120 [2024-11-17 04:25:02.727331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.727360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.120 [2024-11-17 04:25:02.727491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.727512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.120 [2024-11-17 04:25:02.727641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.727660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.120 #6 NEW cov: 12405 ft: 14240 corp: 5/12b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 CrossOver- 00:08:24.120 [2024-11-17 04:25:02.797459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.797486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.120 [2024-11-17 04:25:02.797603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.797619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.120 [2024-11-17 04:25:02.797733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.797750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.120 #7 NEW cov: 12405 ft: 14282 corp: 6/15b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 ChangeBit- 00:08:24.120 [2024-11-17 04:25:02.867443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.867470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.120 [2024-11-17 04:25:02.867599] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.867616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.120 #8 NEW cov: 12405 ft: 14438 corp: 7/17b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 EraseBytes- 00:08:24.120 [2024-11-17 04:25:02.937299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.120 [2024-11-17 04:25:02.937326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.379 #9 NEW cov: 12405 ft: 14500 corp: 8/18b lim: 5 exec/s: 0 rss: 72Mb L: 1/3 MS: 1 ChangeByte- 00:08:24.379 [2024-11-17 04:25:02.987663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.379 [2024-11-17 04:25:02.987689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.379 [2024-11-17 04:25:02.987822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:02.987840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.380 #10 NEW cov: 12405 ft: 14555 corp: 9/20b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 EraseBytes- 00:08:24.380 [2024-11-17 04:25:03.037584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.037617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.380 #11 NEW cov: 12405 ft: 14618 corp: 10/21b lim: 5 exec/s: 0 rss: 72Mb L: 1/3 MS: 1 ShuffleBytes- 00:08:24.380 [2024-11-17 04:25:03.108096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.108124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.380 [2024-11-17 04:25:03.108256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.108273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.380 #12 NEW cov: 12405 ft: 14642 corp: 11/23b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 CrossOver- 00:08:24.380 [2024-11-17 04:25:03.158269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.158298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.380 [2024-11-17 04:25:03.158431] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.158449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.380 #13 NEW cov: 12405 ft: 14671 corp: 12/25b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 InsertByte- 00:08:24.380 [2024-11-17 04:25:03.208619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.208651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.380 [2024-11-17 04:25:03.208776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.380 [2024-11-17 04:25:03.208793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.639 #14 NEW cov: 12405 ft: 14732 corp: 13/27b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 ChangeBit- 00:08:24.639 [2024-11-17 04:25:03.278762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.639 [2024-11-17 04:25:03.278790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.639 [2024-11-17 04:25:03.278929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.639 [2024-11-17 04:25:03.278946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.639 #15 NEW cov: 12405 ft: 14744 corp: 14/29b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 ShuffleBytes- 00:08:24.639 [2024-11-17 04:25:03.329110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.639 [2024-11-17 04:25:03.329137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.639 [2024-11-17 04:25:03.329253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.639 [2024-11-17 04:25:03.329271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.639 [2024-11-17 04:25:03.329390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.639 [2024-11-17 04:25:03.329407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.898 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:24.898 #16 NEW cov: 12428 ft: 14795 corp: 15/32b lim: 5 exec/s: 16 rss: 73Mb L: 3/3 MS: 1 InsertByte- 00:08:24.899 [2024-11-17 
04:25:03.659531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.899 [2024-11-17 04:25:03.659569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.899 #17 NEW cov: 12428 ft: 14895 corp: 16/33b lim: 5 exec/s: 17 rss: 73Mb L: 1/3 MS: 1 CrossOver- 00:08:24.899 [2024-11-17 04:25:03.709614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.899 [2024-11-17 04:25:03.709642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.159 #18 NEW cov: 12428 ft: 14920 corp: 17/34b lim: 5 exec/s: 18 rss: 73Mb L: 1/3 MS: 1 ShuffleBytes- 00:08:25.159 [2024-11-17 04:25:03.779834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.779864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.159 #19 NEW cov: 12428 ft: 14926 corp: 18/35b lim: 5 exec/s: 19 rss: 73Mb L: 1/3 MS: 1 CopyPart- 00:08:25.159 [2024-11-17 04:25:03.830458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.830489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.159 [2024-11-17 04:25:03.830620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.830637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.159 [2024-11-17 04:25:03.830756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.830772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.159 #20 NEW cov: 12428 ft: 14955 corp: 19/38b lim: 5 exec/s: 20 rss: 73Mb L: 3/3 MS: 1 ShuffleBytes- 00:08:25.159 [2024-11-17 04:25:03.880580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.880609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.159 [2024-11-17 04:25:03.880743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.880760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.159 [2024-11-17 04:25:03.880881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.880897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.159 #21 NEW cov: 12428 ft: 14956 corp: 20/41b lim: 5 exec/s: 21 rss: 73Mb L: 3/3 MS: 1 CopyPart- 00:08:25.159 [2024-11-17 04:25:03.950341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.159 [2024-11-17 04:25:03.950369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.159 #22 NEW cov: 12428 ft: 14967 corp: 21/42b lim: 5 exec/s: 22 rss: 73Mb L: 1/3 MS: 1 EraseBytes- 00:08:25.420 [2024-11-17 04:25:04.000422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.000451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.420 #23 NEW cov: 12428 ft: 14972 corp: 22/43b lim: 5 exec/s: 23 rss: 74Mb L: 1/3 MS: 1 CopyPart- 00:08:25.420 [2024-11-17 04:25:04.070977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.071005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.420 [2024-11-17 04:25:04.071135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.071155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.420 #24 NEW cov: 12428 ft: 14984 corp: 23/45b lim: 5 exec/s: 24 rss: 74Mb L: 2/3 MS: 1 CopyPart- 00:08:25.420 [2024-11-17 04:25:04.131155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.131182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.420 [2024-11-17 04:25:04.131304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.131320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.420 #25 NEW cov: 12428 ft: 15057 corp: 24/47b lim: 5 exec/s: 25 rss: 74Mb L: 2/3 MS: 1 ChangeBit- 00:08:25.420 [2024-11-17 04:25:04.191327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.191354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.420 [2024-11-17 04:25:04.191472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.191490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.420 #26 NEW cov: 12428 ft: 15093 corp: 25/49b lim: 5 exec/s: 26 rss: 74Mb L: 2/3 MS: 1 ChangeBit- 00:08:25.420 [2024-11-17 04:25:04.241401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.241430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.420 [2024-11-17 04:25:04.241563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.420 [2024-11-17 04:25:04.241580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.682 #27 NEW cov: 12428 ft: 15106 corp: 26/51b lim: 5 exec/s: 27 rss: 74Mb L: 2/3 MS: 1 InsertByte- 00:08:25.682 [2024-11-17 04:25:04.291307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.291336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.682 #28 NEW cov: 12428 ft: 15188 corp: 27/52b lim: 5 exec/s: 28 rss: 74Mb L: 1/3 MS: 1 EraseBytes- 00:08:25.682 [2024-11-17 04:25:04.342268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.342298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.342413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.342430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.342543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.342561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.342685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.342706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.682 #29 NEW cov: 12428 ft: 15480 corp: 28/56b lim: 5 exec/s: 29 rss: 74Mb L: 4/4 MS: 1 InsertByte- 00:08:25.682 [2024-11-17 04:25:04.391910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:25.682 [2024-11-17 04:25:04.391936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.392066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.392081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.682 #30 NEW cov: 12428 ft: 15488 corp: 29/58b lim: 5 exec/s: 30 rss: 74Mb L: 2/4 MS: 1 CopyPart- 00:08:25.682 [2024-11-17 04:25:04.442667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.442698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.442821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.442839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.442951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.442967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.682 [2024-11-17 04:25:04.443090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.682 [2024-11-17 04:25:04.443109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.682 #31 NEW cov: 12428 ft: 15512 corp: 30/62b lim: 5 exec/s: 15 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:08:25.682 #31 DONE cov: 12428 ft: 15512 corp: 30/62b lim: 5 exec/s: 15 rss: 74Mb 00:08:25.682 Done 31 runs in 2 second(s) 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.943 04:25:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:25.943 [2024-11-17 04:25:04.632078] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:25.943 [2024-11-17 04:25:04.632146] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152973 ] 00:08:26.204 [2024-11-17 04:25:04.839356] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.204 [2024-11-17 04:25:04.853093] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.204 [2024-11-17 04:25:04.906017] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.204 [2024-11-17 04:25:04.922300] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:26.204 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.204 INFO: Seed: 751576512 00:08:26.204 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:26.204 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:26.204 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:26.204 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.204 #2 INITED exec/s: 0 rss: 65Mb 00:08:26.204 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:26.204 This may also happen if the target rejected all inputs we tried so far 00:08:26.204 [2024-11-17 04:25:04.981703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.204 [2024-11-17 04:25:04.981733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.204 [2024-11-17 04:25:04.981805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.204 [2024-11-17 04:25:04.981819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.204 [2024-11-17 04:25:04.981878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.204 [2024-11-17 04:25:04.981893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.204 [2024-11-17 04:25:04.981951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.204 [2024-11-17 04:25:04.981964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.463 NEW_FUNC[1/714]: 0x45f648 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:26.463 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.463 #17 NEW cov: 12220 ft: 12222 corp: 2/38b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 5 ShuffleBytes-ShuffleBytes-CMP-CopyPart-InsertRepeatedBytes- DE: "~\000"- 00:08:26.723 [2024-11-17 04:25:05.313565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:230a7e30 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.313614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.313776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.313799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.313942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.313963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.314105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.314126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.723 NEW_FUNC[1/1]: 0x19ca168 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1188 00:08:26.723 #18 NEW cov: 12337 ft: 12941 corp: 3/76b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertByte- 00:08:26.723 [2024-11-17 04:25:05.383591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.383621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.383775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.383796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.383933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.383951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.384087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.384105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.723 #19 NEW cov: 12343 ft: 13222 corp: 4/113b lim: 40 exec/s: 0 rss: 72Mb L: 37/38 MS: 1 ShuffleBytes- 00:08:26.723 [2024-11-17 04:25:05.433494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.433524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.433657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.433674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.433802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.433821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.723 #20 NEW cov: 12428 ft: 13980 corp: 5/143b lim: 40 exec/s: 0 rss: 72Mb L: 30/38 MS: 1 EraseBytes- 00:08:26.723 [2024-11-17 04:25:05.503980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.504009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.723 
[2024-11-17 04:25:05.504146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:303030ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.504163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.504294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff30 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.504310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.723 [2024-11-17 04:25:05.504439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.723 [2024-11-17 04:25:05.504456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.723 #21 NEW cov: 12428 ft: 14047 corp: 6/181b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:26.984 [2024-11-17 04:25:05.574137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.574168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.574302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:383030ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.574323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.574453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff30 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.574469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.574602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.574617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.984 #22 NEW cov: 12428 ft: 14086 corp: 7/219b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 ChangeBit- 00:08:26.984 [2024-11-17 04:25:05.644330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:230a7e30 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.644359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.644491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.644510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.644647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.644662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.644810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.644825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.984 #23 NEW cov: 12428 ft: 14141 corp: 8/258b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 CrossOver- 00:08:26.984 [2024-11-17 04:25:05.714305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.714333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.714461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303032 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.714477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.714606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.714622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.984 #24 NEW cov: 12428 ft: 14171 corp: 9/289b lim: 40 exec/s: 0 rss: 72Mb L: 31/39 MS: 1 InsertByte- 00:08:26.984 [2024-11-17 04:25:05.764702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.764729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.764846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.764866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.764999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.765016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.984 [2024-11-17 04:25:05.765136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.984 [2024-11-17 04:25:05.765153] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.984 #25 NEW cov: 12428 ft: 14218 corp: 10/327b lim: 40 exec/s: 0 rss: 72Mb L: 38/39 MS: 1 ShuffleBytes- 00:08:27.245 [2024-11-17 04:25:05.814862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.814891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.815021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.815037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.815165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:10303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.815181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.815305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.815321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.245 #26 NEW cov: 12428 ft: 14318 corp: 11/364b lim: 40 exec/s: 0 rss: 72Mb L: 37/39 MS: 1 ChangeBit- 00:08:27.245 [2024-11-17 04:25:05.854754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.854795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.854937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.854953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.855083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:3030307e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.855098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.245 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:27.245 #27 NEW cov: 12451 ft: 14369 corp: 12/388b lim: 40 exec/s: 0 rss: 73Mb L: 24/39 MS: 1 EraseBytes- 00:08:27.245 [2024-11-17 04:25:05.915049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.915080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.915208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.915225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.915349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:307a3030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.915364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.915489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.915506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.245 #28 NEW cov: 12451 ft: 14459 corp: 13/427b lim: 40 exec/s: 0 rss: 73Mb L: 39/39 MS: 1 InsertByte- 00:08:27.245 [2024-11-17 04:25:05.955150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.245 [2024-11-17 04:25:05.955178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.245 [2024-11-17 04:25:05.955309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:05.955325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:05.955459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:3030d030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:05.955474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:05.955593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:05.955609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.246 #29 NEW cov: 12451 ft: 14478 corp: 14/464b lim: 40 exec/s: 29 rss: 73Mb L: 37/39 MS: 1 ChangeBinInt- 00:08:27.246 [2024-11-17 04:25:06.005388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.005416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:06.005544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.005561] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:06.005677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:45303030 cdw11:3030d030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.005698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:06.005830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.005846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.246 #30 NEW cov: 12451 ft: 14494 corp: 15/501b lim: 40 exec/s: 30 rss: 73Mb L: 37/39 MS: 1 ChangeByte- 00:08:27.246 [2024-11-17 04:25:06.065536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.065563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:06.065691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:3030302c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.065713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:06.065852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.065867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.246 [2024-11-17 04:25:06.065991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.246 [2024-11-17 04:25:06.066007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.506 #31 NEW cov: 12451 ft: 14555 corp: 16/539b lim: 40 exec/s: 31 rss: 73Mb L: 38/39 MS: 1 InsertByte- 00:08:27.506 [2024-11-17 04:25:06.105587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.506 [2024-11-17 04:25:06.105616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.105767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.105785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.105929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:27.507 [2024-11-17 04:25:06.105944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.106076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.106093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.507 #32 NEW cov: 12451 ft: 14556 corp: 17/576b lim: 40 exec/s: 32 rss: 73Mb L: 37/39 MS: 1 ShuffleBytes- 00:08:27.507 [2024-11-17 04:25:06.145701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.145728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.145859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.145875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.146011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:3030d030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.146030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.146149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.146165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.507 #33 NEW cov: 12451 ft: 14567 corp: 18/613b lim: 40 exec/s: 33 rss: 73Mb L: 37/39 MS: 1 ShuffleBytes- 00:08:27.507 [2024-11-17 04:25:06.195936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.195963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.196102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.196118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.196247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.196264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.196390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 
cdw10:30303030 cdw11:3030307e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.196407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.507 #34 NEW cov: 12451 ft: 14582 corp: 19/650b lim: 40 exec/s: 34 rss: 73Mb L: 37/39 MS: 1 PersAutoDict- DE: "~\000"- 00:08:27.507 [2024-11-17 04:25:06.266142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30313030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.266169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.266309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.266327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.266454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:3030d030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.266471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.266595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.266612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.507 #35 NEW cov: 12451 ft: 14665 corp: 20/687b lim: 40 exec/s: 35 rss: 73Mb L: 37/39 MS: 1 ChangeBit- 00:08:27.507 [2024-11-17 04:25:06.306189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.306216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.306364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.306385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.306516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:45303030 cdw11:3030d030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.306532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.507 [2024-11-17 04:25:06.306664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30110000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.507 [2024-11-17 04:25:06.306680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.768 #36 NEW cov: 12451 ft: 14679 corp: 21/724b lim: 40 exec/s: 36 
rss: 73Mb L: 37/39 MS: 1 CMP- DE: "\021\000\000\000\000\000\000\000"- 00:08:27.768 [2024-11-17 04:25:06.376419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.376446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.376582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.376599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.376735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.376752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.376885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.376900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.768 #37 NEW cov: 12451 ft: 14735 corp: 22/761b lim: 40 exec/s: 37 rss: 73Mb L: 37/39 MS: 1 ChangeBit- 00:08:27.768 [2024-11-17 04:25:06.426237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.426263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.426405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.426421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.426550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:3030347e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.426566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.768 #38 NEW cov: 12451 ft: 14739 corp: 23/785b lim: 40 exec/s: 38 rss: 73Mb L: 24/39 MS: 1 ChangeBit- 00:08:27.768 [2024-11-17 04:25:06.486690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.486722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.486859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3030303f cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.486876] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.487006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.487023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.487153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.487170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.768 #39 NEW cov: 12451 ft: 14752 corp: 24/822b lim: 40 exec/s: 39 rss: 73Mb L: 37/39 MS: 1 ChangeByte- 00:08:27.768 [2024-11-17 04:25:06.526854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.526883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.527030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.527047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.527176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30301100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.527194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.527332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00003030 cdw11:3030347e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.527350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.768 #40 NEW cov: 12451 ft: 14761 corp: 25/854b lim: 40 exec/s: 40 rss: 73Mb L: 32/39 MS: 1 PersAutoDict- DE: "\021\000\000\000\000\000\000\000"- 00:08:27.768 [2024-11-17 04:25:06.597119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.597146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.768 [2024-11-17 04:25:06.597279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:383030ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.768 [2024-11-17 04:25:06.597296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.597426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff00 
cdw11:00303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.597444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.597575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.597591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.029 #41 NEW cov: 12451 ft: 14767 corp: 26/892b lim: 40 exec/s: 41 rss: 73Mb L: 38/39 MS: 1 CMP- DE: "\000\000"- 00:08:28.029 [2024-11-17 04:25:06.667324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a303030 cdw11:30383030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.667352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.667499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.667514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.667635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00003030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.667661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.667815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.667834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.029 #42 NEW cov: 12451 ft: 14788 corp: 27/927b lim: 40 exec/s: 42 rss: 73Mb L: 35/39 MS: 1 EraseBytes- 00:08:28.029 [2024-11-17 04:25:06.737450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a303030 cdw11:30383030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.737476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.737604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.737621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.737771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00003030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.737788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.737926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.737943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.029 #43 NEW cov: 12451 ft: 14832 corp: 28/966b lim: 40 exec/s: 43 rss: 73Mb L: 39/39 MS: 1 CMP- DE: "\000\000\001X"- 00:08:28.029 [2024-11-17 04:25:06.807619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.807646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.807782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.807800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.807927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.807944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.808080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.808097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.029 #44 NEW cov: 12451 ft: 14849 corp: 29/1004b lim: 40 exec/s: 44 rss: 74Mb L: 38/39 MS: 1 CopyPart- 00:08:28.029 [2024-11-17 04:25:06.857924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.857951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.858088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.858106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.029 [2024-11-17 04:25:06.858236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.029 [2024-11-17 04:25:06.858253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.858373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.858391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.290 #45 NEW cov: 
12451 ft: 14850 corp: 30/1043b lim: 40 exec/s: 45 rss: 74Mb L: 39/39 MS: 1 CopyPart- 00:08:28.290 [2024-11-17 04:25:06.908034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e2330 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.908062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.908209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.908227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.908362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.908381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.908514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.908529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.290 #46 NEW cov: 12451 ft: 14873 corp: 31/1081b lim: 40 exec/s: 46 rss: 74Mb L: 38/39 MS: 1 CrossOver- 00:08:28.290 [2024-11-17 04:25:06.978258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a7e3030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.978286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.978425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.978446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.978581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:10303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.978598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.290 [2024-11-17 04:25:06.978735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.290 [2024-11-17 04:25:06.978753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.290 #47 NEW cov: 12451 ft: 14888 corp: 32/1118b lim: 40 exec/s: 23 rss: 74Mb L: 37/39 MS: 1 ChangeBinInt- 00:08:28.290 #47 DONE cov: 12451 ft: 14888 corp: 32/1118b lim: 40 exec/s: 23 rss: 74Mb 00:08:28.290 ###### Recommended dictionary. 
###### 00:08:28.290 "~\000" # Uses: 1 00:08:28.290 "\021\000\000\000\000\000\000\000" # Uses: 1 00:08:28.290 "\000\000" # Uses: 0 00:08:28.290 "\000\000\001X" # Uses: 0 00:08:28.290 ###### End of recommended dictionary. ###### 00:08:28.290 Done 47 runs in 2 second(s) 00:08:28.290 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:28.551 04:25:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:28.551 [2024-11-17 04:25:07.169455] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:28.551 [2024-11-17 04:25:07.169525] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid153371 ] 00:08:28.551 [2024-11-17 04:25:07.369369] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.811 [2024-11-17 04:25:07.383306] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.811 [2024-11-17 04:25:07.435911] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.811 [2024-11-17 04:25:07.452203] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:28.811 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.811 INFO: Seed: 3283567333 00:08:28.811 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:28.811 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:28.811 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:28.811 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.811 #2 INITED exec/s: 0 rss: 65Mb 00:08:28.811 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.811 This may also happen if the target rejected all inputs we tried so far 00:08:28.811 [2024-11-17 04:25:07.497087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.811 [2024-11-17 04:25:07.497122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.071 NEW_FUNC[1/716]: 0x4613b8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:29.071 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.071 #17 NEW cov: 12236 ft: 12231 corp: 2/16b lim: 40 exec/s: 0 rss: 72Mb L: 15/15 MS: 5 ChangeBit-CopyPart-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:29.071 [2024-11-17 04:25:07.857969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.071 [2024-11-17 04:25:07.858010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.331 #18 NEW cov: 12349 ft: 12607 corp: 3/31b lim: 40 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 ChangeByte- 00:08:29.331 [2024-11-17 04:25:07.948075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.331 [2024-11-17 04:25:07.948106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.331 #19 NEW cov: 12355 ft: 12960 corp: 4/46b lim: 40 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 ChangeBinInt- 00:08:29.331 [2024-11-17 04:25:08.038349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.331 [2024-11-17 04:25:08.038379] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.331 [2024-11-17 04:25:08.038427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:00000050 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.331 [2024-11-17 04:25:08.038443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.331 #21 NEW cov: 12440 ft: 13977 corp: 5/62b lim: 40 exec/s: 0 rss: 72Mb L: 16/16 MS: 2 CopyPart-CrossOver- 00:08:29.331 [2024-11-17 04:25:08.098477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.331 [2024-11-17 04:25:08.098506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.331 [2024-11-17 04:25:08.098553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:00000050 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.331 [2024-11-17 04:25:08.098569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.591 #22 NEW cov: 12440 ft: 14076 corp: 6/78b lim: 40 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 ShuffleBytes- 00:08:29.591 [2024-11-17 04:25:08.188925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.591 [2024-11-17 04:25:08.188954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.591 [2024-11-17 04:25:08.189004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000da40 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.591 [2024-11-17 04:25:08.189020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.591 [2024-11-17 04:25:08.189050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.591 [2024-11-17 04:25:08.189065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.591 #23 NEW cov: 12440 ft: 14416 corp: 7/108b lim: 40 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 CrossOver- 00:08:29.591 [2024-11-17 04:25:08.279123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.591 [2024-11-17 04:25:08.279153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.591 [2024-11-17 04:25:08.279186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.591 [2024-11-17 04:25:08.279201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.591 [2024-11-17 04:25:08.279231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.592 [2024-11-17 04:25:08.279247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.592 #24 NEW cov: 12440 ft: 14458 corp: 8/137b lim: 40 exec/s: 0 rss: 72Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:08:29.592 [2024-11-17 04:25:08.339181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.592 [2024-11-17 04:25:08.339210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.592 [2024-11-17 04:25:08.339257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff0b26 cdw11:00000050 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.592 [2024-11-17 04:25:08.339272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.592 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:29.592 #25 NEW cov: 12457 ft: 14511 corp: 9/153b lim: 40 exec/s: 0 rss: 73Mb L: 16/30 MS: 1 CMP- DE: "\377\377\377\013"- 00:08:29.852 [2024-11-17 04:25:08.429462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.429493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.852 [2024-11-17 04:25:08.429526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.429542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.852 #26 NEW cov: 12457 ft: 14568 corp: 10/169b lim: 40 exec/s: 0 rss: 73Mb L: 16/30 MS: 1 ChangeBinInt- 00:08:29.852 [2024-11-17 04:25:08.489478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.489512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.852 #27 NEW cov: 12457 ft: 14592 corp: 11/183b lim: 40 exec/s: 27 rss: 73Mb L: 14/30 MS: 1 CrossOver- 00:08:29.852 [2024-11-17 04:25:08.579928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:000a4000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.579958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.852 [2024-11-17 04:25:08.580006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.580022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.852 [2024-11-17 04:25:08.580051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:6 nsid:0 cdw10:00000026 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.580067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.852 #28 NEW cov: 12457 ft: 14674 corp: 12/207b lim: 40 exec/s: 28 rss: 73Mb L: 24/30 MS: 1 CrossOver- 00:08:29.852 [2024-11-17 04:25:08.639905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.852 [2024-11-17 04:25:08.639937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.852 #29 NEW cov: 12457 ft: 14687 corp: 13/222b lim: 40 exec/s: 29 rss: 73Mb L: 15/30 MS: 1 ChangeBinInt- 00:08:30.112 [2024-11-17 04:25:08.690041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.690071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.112 #30 NEW cov: 12457 ft: 14774 corp: 14/237b lim: 40 exec/s: 30 rss: 73Mb L: 15/30 MS: 1 ChangeBit- 00:08:30.112 [2024-11-17 04:25:08.740233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:dafffff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.740264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.112 [2024-11-17 04:25:08.740297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.740313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.112 #31 NEW cov: 12457 ft: 14815 corp: 15/260b lim: 40 exec/s: 31 rss: 73Mb L: 23/30 MS: 1 CopyPart- 00:08:30.112 [2024-11-17 04:25:08.800348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.800377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.112 [2024-11-17 04:25:08.800425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:0000f64f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.800441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.112 #32 NEW cov: 12457 ft: 14828 corp: 16/276b lim: 40 exec/s: 32 rss: 73Mb L: 16/30 MS: 1 ChangeBinInt- 00:08:30.112 [2024-11-17 04:25:08.850503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:dafffff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.850537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.112 [2024-11-17 04:25:08.850585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.850600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.112 #33 NEW cov: 12457 ft: 14842 corp: 17/299b lim: 40 exec/s: 33 rss: 73Mb L: 23/30 MS: 1 CopyPart- 00:08:30.112 [2024-11-17 04:25:08.940894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0abfffff cdw11:ff0a4000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.940925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.112 [2024-11-17 04:25:08.940974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.940990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.112 [2024-11-17 04:25:08.941021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000026 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.112 [2024-11-17 04:25:08.941038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.372 #34 NEW cov: 12457 ft: 14866 corp: 18/323b lim: 40 exec/s: 34 rss: 73Mb L: 24/30 MS: 1 ChangeBinInt- 00:08:30.372 [2024-11-17 04:25:09.041758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00daffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.372 [2024-11-17 04:25:09.041784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.372 [2024-11-17 04:25:09.041843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f7400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.372 [2024-11-17 04:25:09.041857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.372 #37 NEW cov: 12457 ft: 15014 corp: 19/345b lim: 40 exec/s: 37 rss: 73Mb L: 22/30 MS: 3 CopyPart-EraseBytes-CrossOver- 00:08:30.372 [2024-11-17 04:25:09.081956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0abfffff cdw11:ff0a4000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.372 [2024-11-17 04:25:09.081981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.372 [2024-11-17 04:25:09.082055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.372 [2024-11-17 04:25:09.082070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.372 [2024-11-17 04:25:09.082128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000026 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.372 [2024-11-17 04:25:09.082141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:30.373 #38 NEW cov: 12457 ft: 15069 corp: 20/369b lim: 40 exec/s: 38 rss: 73Mb L: 24/30 MS: 1 ChangeBinInt- 00:08:30.373 [2024-11-17 04:25:09.142016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:004f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.373 [2024-11-17 04:25:09.142040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.373 [2024-11-17 04:25:09.142118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:00000050 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.373 [2024-11-17 04:25:09.142132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.373 #39 NEW cov: 12457 ft: 15107 corp: 21/385b lim: 40 exec/s: 39 rss: 73Mb L: 16/30 MS: 1 ChangeByte- 00:08:30.373 [2024-11-17 04:25:09.181929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.373 [2024-11-17 04:25:09.181954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.632 #40 NEW cov: 12457 ft: 15135 corp: 22/400b lim: 40 exec/s: 40 rss: 73Mb L: 15/30 MS: 1 CopyPart- 00:08:30.632 [2024-11-17 04:25:09.222018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.632 [2024-11-17 04:25:09.222043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.632 #41 NEW cov: 12457 ft: 15143 corp: 23/408b lim: 40 exec/s: 41 rss: 73Mb L: 8/30 MS: 1 EraseBytes- 00:08:30.632 [2024-11-17 04:25:09.262313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a400000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.632 [2024-11-17 04:25:09.262338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.632 [2024-11-17 04:25:09.262414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff0b00 cdw11:00000026 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.632 [2024-11-17 04:25:09.262428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.632 #42 NEW cov: 12457 ft: 15159 corp: 24/428b lim: 40 exec/s: 42 rss: 73Mb L: 20/30 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:30.632 [2024-11-17 04:25:09.302258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.632 [2024-11-17 04:25:09.302283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.632 #43 NEW cov: 12457 ft: 15235 corp: 25/442b lim: 40 exec/s: 43 rss: 73Mb L: 14/30 MS: 1 EraseBytes- 00:08:30.632 [2024-11-17 04:25:09.362752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:dafffff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.362777] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.633 [2024-11-17 04:25:09.362836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.362851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.633 [2024-11-17 04:25:09.362908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:21fff750 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.362921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.633 #44 NEW cov: 12464 ft: 15256 corp: 26/466b lim: 40 exec/s: 44 rss: 73Mb L: 24/30 MS: 1 InsertByte- 00:08:30.633 [2024-11-17 04:25:09.423085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0abfffff cdw11:ff0a4000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.423110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.633 [2024-11-17 04:25:09.423172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.423186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.633 [2024-11-17 04:25:09.423241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.423254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.633 [2024-11-17 04:25:09.423311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.633 [2024-11-17 04:25:09.423324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.895 #45 NEW cov: 12464 ft: 15623 corp: 27/503b lim: 40 exec/s: 45 rss: 73Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:30.895 [2024-11-17 04:25:09.483242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.895 [2024-11-17 04:25:09.483267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.895 [2024-11-17 04:25:09.483325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.895 [2024-11-17 04:25:09.483338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.895 [2024-11-17 04:25:09.483393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.895 [2024-11-17 
04:25:09.483406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.895 [2024-11-17 04:25:09.483464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.895 [2024-11-17 04:25:09.483477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.895 #46 NEW cov: 12464 ft: 15633 corp: 28/537b lim: 40 exec/s: 23 rss: 73Mb L: 34/37 MS: 1 CrossOver- 00:08:30.895 #46 DONE cov: 12464 ft: 15633 corp: 28/537b lim: 40 exec/s: 23 rss: 73Mb 00:08:30.895 ###### Recommended dictionary. ###### 00:08:30.895 "\377\377\377\013" # Uses: 1 00:08:30.895 ###### End of recommended dictionary. ###### 00:08:30.895 Done 46 runs in 2 second(s) 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:30.895 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.896 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.896 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.896 04:25:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:30.896 [2024-11-17 04:25:09.669327] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 
22.11.4 initialization... 00:08:30.896 [2024-11-17 04:25:09.669401] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid153792 ] 00:08:31.156 [2024-11-17 04:25:09.873692] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.156 [2024-11-17 04:25:09.886615] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.156 [2024-11-17 04:25:09.939097] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.156 [2024-11-17 04:25:09.955404] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:31.156 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.156 INFO: Seed: 1489601031 00:08:31.416 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:31.416 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:31.416 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:31.416 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.416 #2 INITED exec/s: 0 rss: 65Mb 00:08:31.416 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.416 This may also happen if the target rejected all inputs we tried so far 00:08:31.416 [2024-11-17 04:25:10.025905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.416 [2024-11-17 04:25:10.025945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.416 [2024-11-17 04:25:10.026102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.416 [2024-11-17 04:25:10.026121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.676 NEW_FUNC[1/716]: 0x463128 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:31.676 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.676 #19 NEW cov: 12234 ft: 12235 corp: 2/18b lim: 40 exec/s: 0 rss: 72Mb L: 17/17 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:31.676 [2024-11-17 04:25:10.377152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.377198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.377331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.377354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.377479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.377497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.377624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.377642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.676 #25 NEW cov: 12347 ft: 13157 corp: 3/51b lim: 40 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:31.676 [2024-11-17 04:25:10.456660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.456690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.456818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.456836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.676 #37 NEW cov: 12353 ft: 13402 corp: 4/70b lim: 40 exec/s: 0 rss: 72Mb L: 19/33 MS: 2 CrossOver-CrossOver- 00:08:31.676 [2024-11-17 04:25:10.497352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.497380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.497503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00600000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.497520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.497636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.497652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.676 [2024-11-17 04:25:10.497774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.676 [2024-11-17 04:25:10.497790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.936 #38 NEW cov: 12438 ft: 13722 corp: 5/104b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertByte- 00:08:31.936 [2024-11-17 04:25:10.567661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.567687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.936 [2024-11-17 04:25:10.567815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00006000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.567831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.936 [2024-11-17 04:25:10.567951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.567970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.936 [2024-11-17 04:25:10.568089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.568106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.936 #39 NEW cov: 12438 ft: 13805 corp: 6/139b lim: 40 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 InsertByte- 00:08:31.936 [2024-11-17 04:25:10.636893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.636921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.936 #40 NEW cov: 12438 ft: 14615 corp: 7/154b lim: 40 exec/s: 0 rss: 72Mb L: 15/35 MS: 1 EraseBytes- 00:08:31.936 [2024-11-17 04:25:10.677090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.677118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.936 #43 NEW cov: 12438 ft: 14741 corp: 8/165b lim: 40 exec/s: 0 rss: 72Mb L: 11/35 MS: 3 ChangeBit-ShuffleBytes-CrossOver- 00:08:31.936 [2024-11-17 04:25:10.727230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.936 [2024-11-17 04:25:10.727257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.936 #46 NEW cov: 12438 ft: 14835 corp: 9/173b lim: 40 exec/s: 0 rss: 72Mb L: 8/35 MS: 3 CrossOver-CrossOver-CrossOver- 00:08:32.196 [2024-11-17 04:25:10.777565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.777592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.196 [2024-11-17 04:25:10.777716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.777734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.196 #47 NEW cov: 12438 ft: 14879 corp: 10/193b lim: 40 
exec/s: 0 rss: 72Mb L: 20/35 MS: 1 InsertByte- 00:08:32.196 [2024-11-17 04:25:10.847775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.847800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.196 [2024-11-17 04:25:10.847923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.847939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.196 #48 NEW cov: 12438 ft: 14911 corp: 11/213b lim: 40 exec/s: 0 rss: 73Mb L: 20/35 MS: 1 ShuffleBytes- 00:08:32.196 [2024-11-17 04:25:10.918599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.918626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.196 [2024-11-17 04:25:10.918761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.918778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.196 [2024-11-17 04:25:10.918907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.918923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.196 [2024-11-17 04:25:10.919048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.196 [2024-11-17 04:25:10.919064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.196 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:32.196 #57 NEW cov: 12461 ft: 14953 corp: 12/252b lim: 40 exec/s: 0 rss: 73Mb L: 39/39 MS: 4 CopyPart-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:32.196 [2024-11-17 04:25:10.958173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.197 [2024-11-17 04:25:10.958201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.197 [2024-11-17 04:25:10.958337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.197 [2024-11-17 04:25:10.958355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.197 #58 NEW cov: 12461 ft: 14996 corp: 13/269b lim: 40 exec/s: 0 rss: 73Mb L: 17/39 MS: 1 EraseBytes- 00:08:32.197 [2024-11-17 
04:25:11.008791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.197 [2024-11-17 04:25:11.008819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.197 [2024-11-17 04:25:11.008946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.197 [2024-11-17 04:25:11.008962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.197 [2024-11-17 04:25:11.009086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.197 [2024-11-17 04:25:11.009101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.197 [2024-11-17 04:25:11.009223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.197 [2024-11-17 04:25:11.009243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.457 #59 NEW cov: 12461 ft: 15022 corp: 14/308b lim: 40 exec/s: 59 rss: 73Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:32.457 [2024-11-17 04:25:11.078529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:31000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.078556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.457 [2024-11-17 04:25:11.078681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.078699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.457 #60 NEW cov: 12461 ft: 15034 corp: 15/325b lim: 40 exec/s: 60 rss: 73Mb L: 17/39 MS: 1 ChangeByte- 00:08:32.457 [2024-11-17 04:25:11.148711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.148738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.457 [2024-11-17 04:25:11.148865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.148882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.457 #61 NEW cov: 12461 ft: 15104 corp: 16/345b lim: 40 exec/s: 61 rss: 73Mb L: 20/39 MS: 1 ChangeBinInt- 00:08:32.457 [2024-11-17 04:25:11.198861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.198890] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.457 [2024-11-17 04:25:11.199009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:d8d8d8d8 cdw11:d8d8d8d8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.199027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.457 #62 NEW cov: 12461 ft: 15139 corp: 17/367b lim: 40 exec/s: 62 rss: 73Mb L: 22/39 MS: 1 EraseBytes- 00:08:32.457 [2024-11-17 04:25:11.249147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.249175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.457 [2024-11-17 04:25:11.249297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00600000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.249312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.457 [2024-11-17 04:25:11.249430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.249445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.457 [2024-11-17 04:25:11.249558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.457 [2024-11-17 04:25:11.249574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.457 #63 NEW cov: 12461 ft: 15162 corp: 18/401b lim: 40 exec/s: 63 rss: 73Mb L: 34/39 MS: 1 ShuffleBytes- 00:08:32.718 [2024-11-17 04:25:11.299129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.299158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.299280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.299297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.718 #64 NEW cov: 12461 ft: 15200 corp: 19/420b lim: 40 exec/s: 64 rss: 73Mb L: 19/39 MS: 1 CopyPart- 00:08:32.718 [2024-11-17 04:25:11.339822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.339852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.339983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 
nsid:0 cdw10:00600000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.339999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.340125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.340141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.340256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.340274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.718 #65 NEW cov: 12461 ft: 15218 corp: 20/454b lim: 40 exec/s: 65 rss: 73Mb L: 34/39 MS: 1 ChangeBit- 00:08:32.718 [2024-11-17 04:25:11.389392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.389419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.389544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.389560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.718 #66 NEW cov: 12461 ft: 15247 corp: 21/474b lim: 40 exec/s: 66 rss: 73Mb L: 20/39 MS: 1 ChangeByte- 00:08:32.718 [2024-11-17 04:25:11.459316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.459344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.718 #67 NEW cov: 12461 ft: 15286 corp: 22/486b lim: 40 exec/s: 67 rss: 73Mb L: 12/39 MS: 1 EraseBytes- 00:08:32.718 [2024-11-17 04:25:11.530394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.530422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.530557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00600000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.530573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.718 [2024-11-17 04:25:11.530707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.530723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:32.718 [2024-11-17 04:25:11.530842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.718 [2024-11-17 04:25:11.530860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.979 #68 NEW cov: 12461 ft: 15292 corp: 23/520b lim: 40 exec/s: 68 rss: 73Mb L: 34/39 MS: 1 ChangeBinInt- 00:08:32.979 [2024-11-17 04:25:11.579699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.979 [2024-11-17 04:25:11.579727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.979 #69 NEW cov: 12461 ft: 15324 corp: 24/529b lim: 40 exec/s: 69 rss: 73Mb L: 9/39 MS: 1 EraseBytes- 00:08:32.979 [2024-11-17 04:25:11.620674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.979 [2024-11-17 04:25:11.620705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.980 [2024-11-17 04:25:11.620835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00006000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.620853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.980 [2024-11-17 04:25:11.620979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.620996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.980 [2024-11-17 04:25:11.621117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.621134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.980 #70 NEW cov: 12461 ft: 15377 corp: 25/564b lim: 40 exec/s: 70 rss: 73Mb L: 35/39 MS: 1 ChangeBinInt- 00:08:32.980 [2024-11-17 04:25:11.690828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:31000000 cdw11:00313131 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.690855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.980 [2024-11-17 04:25:11.690992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:31313131 cdw11:31313131 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.691010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.980 [2024-11-17 04:25:11.691134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:31313131 cdw11:31313131 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 
04:25:11.691152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.980 [2024-11-17 04:25:11.691270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.691287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.980 #71 NEW cov: 12461 ft: 15386 corp: 26/600b lim: 40 exec/s: 71 rss: 73Mb L: 36/39 MS: 1 InsertRepeatedBytes- 00:08:32.980 [2024-11-17 04:25:11.760297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.980 [2024-11-17 04:25:11.760324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.980 #72 NEW cov: 12461 ft: 15393 corp: 27/612b lim: 40 exec/s: 72 rss: 73Mb L: 12/39 MS: 1 ChangeByte- 00:08:33.240 [2024-11-17 04:25:11.831353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:31000000 cdw11:00313131 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.240 [2024-11-17 04:25:11.831384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.240 [2024-11-17 04:25:11.831522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:31313131 cdw11:31313100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.831539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:11.831660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00003131 cdw11:31313131 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.831678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:11.831810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:31313100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.831828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.241 #73 NEW cov: 12461 ft: 15410 corp: 28/651b lim: 40 exec/s: 73 rss: 74Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:33.241 [2024-11-17 04:25:11.900940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a00008a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.900968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:11.901103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000031 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.901120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.241 #74 NEW cov: 12461 ft: 15426 corp: 29/672b lim: 40 exec/s: 74 rss: 74Mb 
L: 21/39 MS: 1 CopyPart- 00:08:33.241 [2024-11-17 04:25:11.971814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.971842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:11.971975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00600000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.971991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:11.972113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.972132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:11.972251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:11.972267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.241 #75 NEW cov: 12461 ft: 15439 corp: 30/710b lim: 40 exec/s: 75 rss: 74Mb L: 38/39 MS: 1 InsertRepeatedBytes- 00:08:33.241 [2024-11-17 04:25:12.021888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:12.021916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:12.022056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:12.022077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:12.022195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00003d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:12.022212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.241 [2024-11-17 04:25:12.022324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.241 [2024-11-17 04:25:12.022341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.241 #76 NEW cov: 12461 ft: 15455 corp: 31/744b lim: 40 exec/s: 38 rss: 74Mb L: 34/39 MS: 1 InsertByte- 00:08:33.241 #76 DONE cov: 12461 ft: 15455 corp: 31/744b lim: 40 exec/s: 38 rss: 74Mb 00:08:33.241 Done 76 runs in 2 second(s) 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 
-- # (( i++ )) 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:33.502 04:25:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:33.502 [2024-11-17 04:25:12.189671] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:33.502 [2024-11-17 04:25:12.189758] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154321 ] 00:08:33.763 [2024-11-17 04:25:12.392437] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.763 [2024-11-17 04:25:12.406256] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.763 [2024-11-17 04:25:12.458642] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.763 [2024-11-17 04:25:12.474952] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:33.763 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:33.763 INFO: Seed: 4009604693 00:08:33.763 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:33.763 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:33.763 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:33.763 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.763 #2 INITED exec/s: 0 rss: 65Mb 00:08:33.763 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.763 This may also happen if the target rejected all inputs we tried so far 00:08:33.763 [2024-11-17 04:25:12.534530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.763 [2024-11-17 04:25:12.534559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.763 [2024-11-17 04:25:12.534633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.763 [2024-11-17 04:25:12.534647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.763 [2024-11-17 04:25:12.534708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.763 [2024-11-17 04:25:12.534722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.023 NEW_FUNC[1/714]: 0x464cf8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:34.023 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.023 #11 NEW cov: 12201 ft: 12223 corp: 2/25b lim: 40 exec/s: 0 rss: 74Mb L: 24/24 MS: 4 InsertByte-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:34.283 [2024-11-17 04:25:12.865458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.283 [2024-11-17 04:25:12.865516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.283 [2024-11-17 04:25:12.865602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.283 [2024-11-17 04:25:12.865629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.284 NEW_FUNC[1/1]: 0x1c474e8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:595 00:08:34.284 #12 NEW cov: 12335 ft: 13210 corp: 3/46b lim: 40 exec/s: 0 rss: 74Mb L: 21/24 MS: 1 EraseBytes- 00:08:34.284 [2024-11-17 04:25:12.935469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:12.935496] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:12.935555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:12.935569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:12.935624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:12.935640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.284 #13 NEW cov: 12341 ft: 13372 corp: 4/76b lim: 40 exec/s: 0 rss: 74Mb L: 30/30 MS: 1 CrossOver- 00:08:34.284 [2024-11-17 04:25:12.975594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f80e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:12.975619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:12.975690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:070707f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:12.975709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:12.975763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:12.975776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.284 #14 NEW cov: 12426 ft: 13628 corp: 5/100b lim: 40 exec/s: 0 rss: 74Mb L: 24/30 MS: 1 ChangeBinInt- 00:08:34.284 [2024-11-17 04:25:13.015692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.015723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:13.015781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.015795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:13.015851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.015865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.284 #15 NEW cov: 12426 ft: 13779 corp: 6/124b lim: 40 exec/s: 0 rss: 74Mb L: 24/30 MS: 1 ShuffleBytes- 00:08:34.284 [2024-11-17 04:25:13.055803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.055828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:13.055886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.055899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:13.055958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.055971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.284 #16 NEW cov: 12426 ft: 13814 corp: 7/149b lim: 40 exec/s: 0 rss: 74Mb L: 25/30 MS: 1 InsertRepeatedBytes- 00:08:34.284 [2024-11-17 04:25:13.095921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8fa0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.095947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:13.096007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:070707f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.096022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.284 [2024-11-17 04:25:13.096081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.284 [2024-11-17 04:25:13.096096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.545 #17 NEW cov: 12426 ft: 13880 corp: 8/173b lim: 40 exec/s: 0 rss: 74Mb L: 24/30 MS: 1 ChangeBit- 00:08:34.545 [2024-11-17 04:25:13.156090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.156116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.156173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.156187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.156261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.156275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.545 #18 NEW cov: 
12426 ft: 14039 corp: 9/203b lim: 40 exec/s: 0 rss: 74Mb L: 30/30 MS: 1 ShuffleBytes- 00:08:34.545 [2024-11-17 04:25:13.216377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.216402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.216460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.216473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.216530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.216543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.216597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.216610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.545 #19 NEW cov: 12426 ft: 14565 corp: 10/241b lim: 40 exec/s: 0 rss: 74Mb L: 38/38 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:34.545 [2024-11-17 04:25:13.256095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:29858585 cdw11:85858585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.256119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.545 #23 NEW cov: 12426 ft: 14884 corp: 11/250b lim: 40 exec/s: 0 rss: 74Mb L: 9/38 MS: 4 InsertByte-ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:08:34.545 [2024-11-17 04:25:13.296466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.296494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.296552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.296566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.296619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f829f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.296632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.545 #24 NEW cov: 12426 ft: 14921 corp: 12/275b lim: 40 exec/s: 0 rss: 74Mb L: 25/38 MS: 1 InsertByte- 00:08:34.545 [2024-11-17 04:25:13.336562] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8fa0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.336587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.336658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:070707f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.336672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.545 [2024-11-17 04:25:13.336728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:3af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.545 [2024-11-17 04:25:13.336743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.806 #25 NEW cov: 12426 ft: 14960 corp: 13/299b lim: 40 exec/s: 0 rss: 74Mb L: 24/38 MS: 1 ChangeByte- 00:08:34.806 [2024-11-17 04:25:13.396863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8fa0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.396888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.396946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:070707f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.396961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.397017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.397031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.397085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:8080f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.397098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.806 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:34.806 #26 NEW cov: 12449 ft: 15063 corp: 14/332b lim: 40 exec/s: 0 rss: 74Mb L: 33/38 MS: 1 CrossOver- 00:08:34.806 [2024-11-17 04:25:13.437010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.437036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.437098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 
[2024-11-17 04:25:13.437112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.437169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.437183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.437240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fffffff8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.437254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.806 #27 NEW cov: 12449 ft: 15103 corp: 15/370b lim: 40 exec/s: 0 rss: 75Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:34.806 [2024-11-17 04:25:13.496726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:29858529 cdw11:85858585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.496752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.806 #28 NEW cov: 12449 ft: 15156 corp: 16/380b lim: 40 exec/s: 28 rss: 75Mb L: 10/38 MS: 1 InsertByte- 00:08:34.806 [2024-11-17 04:25:13.557173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.557198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.557272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.557287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.557345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f83ff8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.557358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.806 #29 NEW cov: 12449 ft: 15169 corp: 17/411b lim: 40 exec/s: 29 rss: 75Mb L: 31/38 MS: 1 InsertByte- 00:08:34.806 [2024-11-17 04:25:13.597268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.597293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.597350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.597364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.806 [2024-11-17 04:25:13.597422] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0006f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.806 [2024-11-17 04:25:13.597435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.067 #30 NEW cov: 12449 ft: 15176 corp: 18/436b lim: 40 exec/s: 30 rss: 75Mb L: 25/38 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\006"- 00:08:35.067 [2024-11-17 04:25:13.657453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8fe cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.657481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.657539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.657553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.657609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.657622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.067 #31 NEW cov: 12449 ft: 15188 corp: 19/460b lim: 40 exec/s: 31 rss: 75Mb L: 24/38 MS: 1 ChangeBinInt- 00:08:35.067 [2024-11-17 04:25:13.697445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.697469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.697527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.697541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.067 #32 NEW cov: 12449 ft: 15195 corp: 20/482b lim: 40 exec/s: 32 rss: 75Mb L: 22/38 MS: 1 EraseBytes- 00:08:35.067 [2024-11-17 04:25:13.737682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:fcf8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.737711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.737769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f80000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.737783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.737841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0006f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:35.067 [2024-11-17 04:25:13.737854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.067 #33 NEW cov: 12449 ft: 15210 corp: 21/507b lim: 40 exec/s: 33 rss: 75Mb L: 25/38 MS: 1 ChangeBit- 00:08:35.067 [2024-11-17 04:25:13.797713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f80801 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.797738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.797794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.797807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.067 #34 NEW cov: 12449 ft: 15218 corp: 22/528b lim: 40 exec/s: 34 rss: 75Mb L: 21/38 MS: 1 ChangeBinInt- 00:08:35.067 [2024-11-17 04:25:13.858124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.858148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.858208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.858222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.858278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:85298585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.858291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.067 [2024-11-17 04:25:13.858345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8585854c cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.067 [2024-11-17 04:25:13.858358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.327 #35 NEW cov: 12449 ft: 15237 corp: 23/561b lim: 40 exec/s: 35 rss: 75Mb L: 33/38 MS: 1 CrossOver- 00:08:35.327 [2024-11-17 04:25:13.917944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:29858585 cdw11:7b858585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:13.917969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.328 #36 NEW cov: 12449 ft: 15249 corp: 24/570b lim: 40 exec/s: 36 rss: 75Mb L: 9/38 MS: 1 ChangeBinInt- 00:08:35.328 [2024-11-17 04:25:13.958386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:13.958410] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:13.958468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:13.958482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:13.958551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f83ff8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:13.958565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:13.958623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:13.958636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.328 #37 NEW cov: 12449 ft: 15262 corp: 25/609b lim: 40 exec/s: 37 rss: 75Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:35.328 [2024-11-17 04:25:14.018452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.018476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.018532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f8f80000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.018546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.018603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0006f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.018619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.328 #38 NEW cov: 12449 ft: 15270 corp: 26/634b lim: 40 exec/s: 38 rss: 75Mb L: 25/39 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:35.328 [2024-11-17 04:25:14.058582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.058606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.058664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:3ff8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.058677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.058727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.058741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.328 #39 NEW cov: 12449 ft: 15295 corp: 27/660b lim: 40 exec/s: 39 rss: 75Mb L: 26/39 MS: 1 EraseBytes- 00:08:35.328 [2024-11-17 04:25:14.098806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.098830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.098890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.098904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.098961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f83ff8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.098974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.328 [2024-11-17 04:25:14.099027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.328 [2024-11-17 04:25:14.099041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.328 #40 NEW cov: 12449 ft: 15341 corp: 28/699b lim: 40 exec/s: 40 rss: 75Mb L: 39/39 MS: 1 ChangeByte- 00:08:35.589 [2024-11-17 04:25:14.158883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a939393 cdw11:0af8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.158909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.158965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.158979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.159035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:93f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.159049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.589 #44 NEW cov: 12449 ft: 15363 corp: 29/727b lim: 40 exec/s: 44 rss: 75Mb L: 28/39 MS: 4 CrossOver-InsertByte-InsertRepeatedBytes-CrossOver- 00:08:35.589 [2024-11-17 04:25:14.199023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8fe cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.199048] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.199107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.199120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.199176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8efef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.199189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.589 #45 NEW cov: 12449 ft: 15400 corp: 30/757b lim: 40 exec/s: 45 rss: 75Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:08:35.589 [2024-11-17 04:25:14.259159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f80e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.259184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.259239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:071707f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.259253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.259326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.259340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.589 #51 NEW cov: 12449 ft: 15412 corp: 31/781b lim: 40 exec/s: 51 rss: 76Mb L: 24/39 MS: 1 ChangeBit- 00:08:35.589 [2024-11-17 04:25:14.299356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.299380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.299436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.299449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.299507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.299520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.299579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 
[2024-11-17 04:25:14.299592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.589 #52 NEW cov: 12449 ft: 15435 corp: 32/819b lim: 40 exec/s: 52 rss: 76Mb L: 38/39 MS: 1 ShuffleBytes- 00:08:35.589 [2024-11-17 04:25:14.359298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.359325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.589 [2024-11-17 04:25:14.359384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00f8f800 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.589 [2024-11-17 04:25:14.359397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.589 #53 NEW cov: 12449 ft: 15442 corp: 33/841b lim: 40 exec/s: 53 rss: 76Mb L: 22/39 MS: 1 EraseBytes- 00:08:35.850 [2024-11-17 04:25:14.419480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.419506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.850 [2024-11-17 04:25:14.419565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00f80000 cdw11:f8000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.419579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.850 #54 NEW cov: 12449 ft: 15487 corp: 34/863b lim: 40 exec/s: 54 rss: 76Mb L: 22/39 MS: 1 ShuffleBytes- 00:08:35.850 [2024-11-17 04:25:14.479776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.479801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.850 [2024-11-17 04:25:14.479861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.479874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.850 [2024-11-17 04:25:14.479929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.479943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.850 #55 NEW cov: 12449 ft: 15499 corp: 35/892b lim: 40 exec/s: 55 rss: 76Mb L: 29/39 MS: 1 EraseBytes- 00:08:35.850 [2024-11-17 04:25:14.519846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.519871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.850 [2024-11-17 04:25:14.519928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.519942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.850 [2024-11-17 04:25:14.520015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:f8f824f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.850 [2024-11-17 04:25:14.520029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.850 #56 NEW cov: 12449 ft: 15527 corp: 36/916b lim: 40 exec/s: 28 rss: 76Mb L: 24/39 MS: 1 ChangeByte- 00:08:35.850 #56 DONE cov: 12449 ft: 15527 corp: 36/916b lim: 40 exec/s: 28 rss: 76Mb 00:08:35.850 ###### Recommended dictionary. ###### 00:08:35.850 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:35.850 "\000\000\000\000\000\000\000\006" # Uses: 0 00:08:35.850 ###### End of recommended dictionary. ###### 00:08:35.850 Done 56 runs in 2 second(s) 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:35.850 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:35.851 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:35.851 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:35.851 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.851 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.851 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:35.851 04:25:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz 
-m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:36.111 [2024-11-17 04:25:14.684820] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:36.111 [2024-11-17 04:25:14.684888] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154617 ] 00:08:36.111 [2024-11-17 04:25:14.892787] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.111 [2024-11-17 04:25:14.906209] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.372 [2024-11-17 04:25:14.958687] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:36.372 [2024-11-17 04:25:14.975019] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:36.372 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.372 INFO: Seed: 2214643279 00:08:36.372 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:36.372 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:36.372 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:36.372 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.372 #2 INITED exec/s: 0 rss: 65Mb 00:08:36.372 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:36.372 This may also happen if the target rejected all inputs we tried so far 00:08:36.372 [2024-11-17 04:25:15.023730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.372 [2024-11-17 04:25:15.023764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.632 NEW_FUNC[1/716]: 0x4668c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:36.632 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.632 #9 NEW cov: 12216 ft: 12208 corp: 2/8b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:36.632 [2024-11-17 04:25:15.384655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.632 [2024-11-17 04:25:15.384710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.632 #10 NEW cov: 12329 ft: 12787 corp: 3/16b lim: 35 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertByte- 00:08:36.892 [2024-11-17 04:25:15.474740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.892 [2024-11-17 04:25:15.474773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.892 #21 NEW cov: 12335 ft: 13014 corp: 4/24b lim: 35 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:36.892 [2024-11-17 04:25:15.554904] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.892 [2024-11-17 04:25:15.554934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.892 #22 NEW cov: 12427 ft: 13375 corp: 5/32b lim: 35 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ChangeByte- 00:08:36.892 [2024-11-17 04:25:15.615064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.892 [2024-11-17 04:25:15.615093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.892 #23 NEW cov: 12427 ft: 13502 corp: 6/40b lim: 35 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ChangeBit- 00:08:36.892 [2024-11-17 04:25:15.705287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.892 [2024-11-17 04:25:15.705317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.152 #24 NEW cov: 12427 ft: 13598 corp: 7/49b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 InsertByte- 00:08:37.152 [2024-11-17 04:25:15.755424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.152 [2024-11-17 04:25:15.755454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.152 #27 NEW 
cov: 12427 ft: 13644 corp: 8/58b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 ChangeBit-ShuffleBytes-CMP- DE: "\377\211v\232$\"\023\266"- 00:08:37.152 [2024-11-17 04:25:15.805677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.152 [2024-11-17 04:25:15.805719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.152 [2024-11-17 04:25:15.805769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.152 [2024-11-17 04:25:15.805786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.152 [2024-11-17 04:25:15.805816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.152 [2024-11-17 04:25:15.805832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.152 #29 NEW cov: 12427 ft: 14387 corp: 9/79b lim: 35 exec/s: 0 rss: 72Mb L: 21/21 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:37.152 [2024-11-17 04:25:15.865751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.152 [2024-11-17 04:25:15.865787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.152 #30 NEW cov: 12427 ft: 14432 corp: 10/88b lim: 35 exec/s: 0 rss: 72Mb L: 9/21 MS: 1 CMP- DE: "\000\014"- 00:08:37.152 [2024-11-17 04:25:15.925894] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.152 [2024-11-17 04:25:15.925924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.412 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:37.412 #31 NEW cov: 12450 ft: 14535 corp: 11/96b lim: 35 exec/s: 0 rss: 72Mb L: 8/21 MS: 1 ChangeByte- 00:08:37.412 [2024-11-17 04:25:16.016230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.016259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.412 [2024-11-17 04:25:16.016307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.016323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.412 [2024-11-17 04:25:16.016353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.016368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.412 #32 NEW cov: 12450 ft: 14591 corp: 12/117b lim: 35 exec/s: 32 rss: 72Mb L: 21/21 MS: 1 ShuffleBytes- 00:08:37.412 [2024-11-17 04:25:16.106344] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.106373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.412 #33 NEW cov: 12450 ft: 14595 corp: 13/126b lim: 35 exec/s: 33 rss: 72Mb L: 9/21 MS: 1 InsertByte- 00:08:37.412 [2024-11-17 04:25:16.156656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.156685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.412 [2024-11-17 04:25:16.156740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.156756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.412 [2024-11-17 04:25:16.156786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.412 [2024-11-17 04:25:16.156802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.413 [2024-11-17 04:25:16.156831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.413 [2024-11-17 04:25:16.156847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.413 #34 NEW cov: 12450 ft: 14942 corp: 14/154b lim: 35 exec/s: 34 rss: 72Mb L: 28/28 MS: 1 CrossOver- 00:08:37.413 [2024-11-17 04:25:16.216633] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.413 [2024-11-17 04:25:16.216664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.673 #35 NEW cov: 12450 ft: 14994 corp: 15/166b lim: 35 exec/s: 35 rss: 73Mb L: 12/28 MS: 1 InsertRepeatedBytes- 00:08:37.673 [2024-11-17 04:25:16.306874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.673 [2024-11-17 04:25:16.306904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.673 #36 NEW cov: 12450 ft: 15013 corp: 16/175b lim: 35 exec/s: 36 rss: 73Mb L: 9/28 MS: 1 InsertByte- 00:08:37.674 [2024-11-17 04:25:16.357057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.674 [2024-11-17 04:25:16.357090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.674 #37 NEW cov: 12450 ft: 15026 corp: 17/187b lim: 35 exec/s: 37 rss: 73Mb L: 12/28 MS: 1 ChangeByte- 00:08:37.674 [2024-11-17 04:25:16.447282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.674 [2024-11-17 04:25:16.447315] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.674 #38 NEW cov: 12450 ft: 15052 corp: 18/199b lim: 35 exec/s: 38 rss: 73Mb L: 12/28 MS: 1 CrossOver- 00:08:37.674 [2024-11-17 04:25:16.497444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.674 [2024-11-17 04:25:16.497476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.934 #39 NEW cov: 12450 ft: 15077 corp: 19/208b lim: 35 exec/s: 39 rss: 73Mb L: 9/28 MS: 1 InsertByte- 00:08:37.934 [2024-11-17 04:25:16.547828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.547859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.934 [2024-11-17 04:25:16.547893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.547909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.934 [2024-11-17 04:25:16.547938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000005e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.547954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.934 [2024-11-17 04:25:16.547983] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.547998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.934 [2024-11-17 04:25:16.548027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.548042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:37.934 #45 NEW cov: 12450 ft: 15222 corp: 20/243b lim: 35 exec/s: 45 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:37.934 [2024-11-17 04:25:16.607727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.607758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.934 #46 NEW cov: 12450 ft: 15235 corp: 21/252b lim: 35 exec/s: 46 rss: 73Mb L: 9/35 MS: 1 ChangeByte- 00:08:37.934 [2024-11-17 04:25:16.657852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.657882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.934 #47 NEW cov: 12450 ft: 15254 corp: 22/261b lim: 35 exec/s: 47 rss: 73Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:37.934 [2024-11-17 04:25:16.748042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.934 [2024-11-17 04:25:16.748072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.194 #48 NEW cov: 12450 ft: 15297 corp: 23/269b lim: 35 exec/s: 48 rss: 73Mb L: 8/35 MS: 1 ChangeBit- 00:08:38.194 [2024-11-17 04:25:16.798361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000058 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.798392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.194 [2024-11-17 04:25:16.798441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.798458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.194 [2024-11-17 04:25:16.798489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.798506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.194 #52 NEW cov: 12450 ft: 15321 corp: 24/295b lim: 35 exec/s: 52 rss: 73Mb L: 26/35 MS: 4 ChangeByte-InsertByte-InsertByte-InsertRepeatedBytes- 00:08:38.194 [2024-11-17 04:25:16.848367] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.848397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.194 [2024-11-17 04:25:16.848444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.848461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.194 #53 NEW cov: 12450 ft: 15522 corp: 25/311b lim: 35 exec/s: 53 rss: 73Mb L: 16/35 MS: 1 PersAutoDict- DE: "\377\211v\232$\"\023\266"- 00:08:38.194 [2024-11-17 04:25:16.908488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.908519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.194 #54 NEW cov: 12450 ft: 15549 corp: 26/320b lim: 35 exec/s: 54 rss: 73Mb L: 9/35 MS: 1 PersAutoDict- DE: "\377\211v\232$\"\023\266"- 00:08:38.194 [2024-11-17 04:25:16.958587] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.194 [2024-11-17 04:25:16.958617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.455 #55 NEW cov: 12450 ft: 15590 corp: 27/328b lim: 35 exec/s: 27 rss: 73Mb L: 8/35 MS: 1 EraseBytes- 00:08:38.455 #55 DONE cov: 12450 ft: 15590 corp: 27/328b lim: 35 exec/s: 27 rss: 73Mb 00:08:38.455 ###### Recommended dictionary. 
###### 00:08:38.455 "\377\211v\232$\"\023\266" # Uses: 2 00:08:38.455 "\000\014" # Uses: 1 00:08:38.455 ###### End of recommended dictionary. ###### 00:08:38.455 Done 55 runs in 2 second(s) 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:38.455 04:25:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:38.455 [2024-11-17 04:25:17.179923] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:38.455 [2024-11-17 04:25:17.180019] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155147 ] 00:08:38.715 [2024-11-17 04:25:17.378358] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.715 [2024-11-17 04:25:17.390954] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.715 [2024-11-17 04:25:17.443247] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.715 [2024-11-17 04:25:17.459560] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:38.715 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.715 INFO: Seed: 403667662 00:08:38.715 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:38.715 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:38.715 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:38.716 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.716 #2 INITED exec/s: 0 rss: 65Mb 00:08:38.716 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.716 This may also happen if the target rejected all inputs we tried so far 00:08:38.716 [2024-11-17 04:25:17.530233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.716 [2024-11-17 04:25:17.530270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.716 [2024-11-17 04:25:17.530416] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.716 [2024-11-17 04:25:17.530434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.716 [2024-11-17 04:25:17.530572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.716 [2024-11-17 04:25:17.530594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.716 [2024-11-17 04:25:17.530739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.716 [2024-11-17 04:25:17.530756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.716 [2024-11-17 04:25:17.530888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.716 [2024-11-17 04:25:17.530907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.236 NEW_FUNC[1/715]: 0x467e08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:39.236 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:39.236 #5 NEW cov: 
12204 ft: 12182 corp: 2/36b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 3 ChangeBinInt-InsertByte-InsertRepeatedBytes- 00:08:39.236 [2024-11-17 04:25:17.871028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.871069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.871198] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.871216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.871340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.871359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.871492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.871509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.871618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.871637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.236 #6 NEW cov: 12317 ft: 12852 corp: 3/71b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:39.236 [2024-11-17 04:25:17.941189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.941222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.941349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.941367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.941491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.941510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.941643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.941664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.941815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 
[2024-11-17 04:25:17.941843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.236 #7 NEW cov: 12323 ft: 13078 corp: 4/106b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:39.236 [2024-11-17 04:25:17.991298] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.991329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.991459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.991477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.991599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.991618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.991744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.991761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:17.991896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:17.991914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.236 #13 NEW cov: 12408 ft: 13312 corp: 5/141b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:08:39.236 [2024-11-17 04:25:18.061493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:18.061523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:18.061663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:18.061683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:18.061819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:18.061837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:18.061978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:18.061996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.236 [2024-11-17 04:25:18.062125] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.236 [2024-11-17 04:25:18.062142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.497 #14 NEW cov: 12408 ft: 13380 corp: 6/176b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:39.497 [2024-11-17 04:25:18.131733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.131766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.131895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.131912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.132039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.132055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.132197] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.132215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.132344] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.132363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.497 #15 NEW cov: 12408 ft: 13476 corp: 7/211b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:08:39.497 [2024-11-17 04:25:18.171776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.171805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.171937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.171954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.172083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.172101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.172224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.172242] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.172372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.172390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.497 #16 NEW cov: 12408 ft: 13597 corp: 8/246b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:39.497 [2024-11-17 04:25:18.231553] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.231583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.231707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.231728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.231860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.231880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.497 #17 NEW cov: 12408 ft: 14123 corp: 9/271b lim: 35 exec/s: 0 rss: 72Mb L: 25/35 MS: 1 CrossOver- 00:08:39.497 [2024-11-17 04:25:18.302143] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.302170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.302309] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.302328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.302450] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.302483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.302610] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.302629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.497 [2024-11-17 04:25:18.302765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.497 [2024-11-17 04:25:18.302783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:39.758 #18 NEW cov: 12408 ft: 14166 corp: 10/306b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 
ChangeBit- 00:08:39.758 [2024-11-17 04:25:18.371964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.371990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.372128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.372146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.372276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.372296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.758 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:39.758 #19 NEW cov: 12431 ft: 14308 corp: 11/328b lim: 35 exec/s: 0 rss: 73Mb L: 22/35 MS: 1 EraseBytes- 00:08:39.758 [2024-11-17 04:25:18.442211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.442239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.442362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.442377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.442508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.442525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.758 #20 NEW cov: 12431 ft: 14322 corp: 12/355b lim: 35 exec/s: 0 rss: 73Mb L: 27/35 MS: 1 CrossOver- 00:08:39.758 [2024-11-17 04:25:18.512352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.512379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.512510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.512527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.512656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.512673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.758 #21 NEW cov: 12431 ft: 14368 corp: 13/380b lim: 35 exec/s: 
21 rss: 73Mb L: 25/35 MS: 1 ChangeBit- 00:08:39.758 [2024-11-17 04:25:18.582879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.582906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.583036] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.583055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.583181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.583196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.583335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.583352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.758 [2024-11-17 04:25:18.583479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.758 [2024-11-17 04:25:18.583496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.018 #22 NEW cov: 12431 ft: 14408 corp: 14/415b lim: 35 exec/s: 22 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:40.018 [2024-11-17 04:25:18.633161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.018 [2024-11-17 04:25:18.633187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.018 [2024-11-17 04:25:18.633322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.018 [2024-11-17 04:25:18.633339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.018 [2024-11-17 04:25:18.633468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.018 [2024-11-17 04:25:18.633485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.018 [2024-11-17 04:25:18.633617] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.633635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.633770] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.633786] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.019 #28 NEW cov: 12431 ft: 14423 corp: 15/450b lim: 35 exec/s: 28 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:08:40.019 [2024-11-17 04:25:18.682918] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.682945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.683075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.683092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.683224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.683240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.019 #29 NEW cov: 12431 ft: 14447 corp: 16/471b lim: 35 exec/s: 29 rss: 73Mb L: 21/35 MS: 1 EraseBytes- 00:08:40.019 [2024-11-17 04:25:18.753057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.753084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.753218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.753234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.753364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.753383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.019 #30 NEW cov: 12431 ft: 14513 corp: 17/496b lim: 35 exec/s: 30 rss: 73Mb L: 25/35 MS: 1 ChangeBinInt- 00:08:40.019 [2024-11-17 04:25:18.813652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.813678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.813799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.813816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.813946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.813964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.019 
[2024-11-17 04:25:18.814092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.814109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.019 [2024-11-17 04:25:18.814238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.019 [2024-11-17 04:25:18.814256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.019 #31 NEW cov: 12431 ft: 14566 corp: 18/531b lim: 35 exec/s: 31 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:40.279 [2024-11-17 04:25:18.863822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.279 [2024-11-17 04:25:18.863849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.279 [2024-11-17 04:25:18.863986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.279 [2024-11-17 04:25:18.864003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.279 [2024-11-17 04:25:18.864139] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.279 [2024-11-17 04:25:18.864157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.279 [2024-11-17 04:25:18.864292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.279 [2024-11-17 04:25:18.864309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.279 [2024-11-17 04:25:18.864437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.279 [2024-11-17 04:25:18.864455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.279 #32 NEW cov: 12431 ft: 14602 corp: 19/566b lim: 35 exec/s: 32 rss: 73Mb L: 35/35 MS: 1 ChangeBit- 00:08:40.280 [2024-11-17 04:25:18.933145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:18.933173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.280 #33 NEW cov: 12431 ft: 14934 corp: 20/576b lim: 35 exec/s: 33 rss: 73Mb L: 10/35 MS: 1 CrossOver- 00:08:40.280 [2024-11-17 04:25:18.984262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:18.984288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:18.984427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:18.984444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:18.984581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:18.984600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:18.984722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:18.984739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:18.984870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:18.984886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.280 #34 NEW cov: 12431 ft: 14982 corp: 21/611b lim: 35 exec/s: 34 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:40.280 [2024-11-17 04:25:19.034428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.034456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.034589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.034607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.034750] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.034766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.034893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.034910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.035039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.035055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.280 #35 NEW cov: 12431 ft: 15015 corp: 22/646b lim: 35 exec/s: 35 rss: 73Mb L: 35/35 MS: 1 ChangeBit- 00:08:40.280 [2024-11-17 04:25:19.074478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.074505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.074634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.074651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.074778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.074795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.074923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.074939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.280 [2024-11-17 04:25:19.075068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.280 [2024-11-17 04:25:19.075086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.280 #36 NEW cov: 12431 ft: 15025 corp: 23/681b lim: 35 exec/s: 36 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:40.539 [2024-11-17 04:25:19.113754] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.113784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.539 #37 NEW cov: 12431 ft: 15032 corp: 24/691b lim: 35 exec/s: 37 rss: 73Mb L: 10/35 MS: 1 CrossOver- 00:08:40.539 [2024-11-17 04:25:19.184395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.184423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.184554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.184572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.184698] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.184715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.539 #42 NEW cov: 12431 ft: 15129 corp: 25/718b lim: 35 exec/s: 42 rss: 73Mb L: 27/35 MS: 5 CrossOver-InsertByte-ChangeByte-ChangeBinInt-CrossOver- 00:08:40.539 [2024-11-17 04:25:19.235040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.235069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.539 
[2024-11-17 04:25:19.235202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007bc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.235220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.235350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.235366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.235492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.235509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.235625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.235641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.539 #43 NEW cov: 12431 ft: 15150 corp: 26/753b lim: 35 exec/s: 43 rss: 73Mb L: 35/35 MS: 1 ChangeByte- 00:08:40.539 [2024-11-17 04:25:19.305322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.305352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.305477] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.305503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.305639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.305656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.305790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.305809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.539 [2024-11-17 04:25:19.305934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.539 [2024-11-17 04:25:19.305953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.539 #44 NEW cov: 12431 ft: 15187 corp: 27/788b lim: 35 exec/s: 44 rss: 74Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:40.798 [2024-11-17 04:25:19.374953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 
04:25:19.374983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.375124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.375142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.375268] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.375286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.798 #45 NEW cov: 12431 ft: 15202 corp: 28/813b lim: 35 exec/s: 45 rss: 74Mb L: 25/35 MS: 1 CrossOver- 00:08:40.798 [2024-11-17 04:25:19.445542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.445569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.445699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.445717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.445839] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.445856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.445982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.445999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.446130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.446150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.798 #46 NEW cov: 12431 ft: 15223 corp: 29/848b lim: 35 exec/s: 46 rss: 74Mb L: 35/35 MS: 1 CopyPart- 00:08:40.798 [2024-11-17 04:25:19.495765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.495795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.798 [2024-11-17 04:25:19.495933] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.798 [2024-11-17 04:25:19.495952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.799 [2024-11-17 04:25:19.496089] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.799 [2024-11-17 04:25:19.496107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.799 [2024-11-17 04:25:19.496232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.799 [2024-11-17 04:25:19.496252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.799 [2024-11-17 04:25:19.496388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.799 [2024-11-17 04:25:19.496406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:40.799 #47 NEW cov: 12431 ft: 15248 corp: 30/883b lim: 35 exec/s: 23 rss: 74Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:40.799 #47 DONE cov: 12431 ft: 15248 corp: 30/883b lim: 35 exec/s: 23 rss: 74Mb 00:08:40.799 Done 47 runs in 2 second(s) 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:40.799 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:41.059 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:41.059 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:41.059 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:41.059 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:41.059 04:25:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:41.059 [2024-11-17 04:25:19.665577] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:41.060 [2024-11-17 04:25:19.665665] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155582 ] 00:08:41.060 [2024-11-17 04:25:19.871785] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.060 [2024-11-17 04:25:19.884393] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.320 [2024-11-17 04:25:19.936713] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:41.320 [2024-11-17 04:25:19.953014] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:41.320 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.320 INFO: Seed: 2899672725 00:08:41.320 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:41.320 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:41.320 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:41.320 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.320 #2 INITED exec/s: 0 rss: 65Mb 00:08:41.320 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.320 This may also happen if the target rejected all inputs we tried so far 00:08:41.320 [2024-11-17 04:25:20.018420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.320 [2024-11-17 04:25:20.018452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.320 [2024-11-17 04:25:20.018507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.320 [2024-11-17 04:25:20.018523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.580 NEW_FUNC[1/716]: 0x4692c8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:41.580 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:41.580 #6 NEW cov: 12290 ft: 12289 corp: 2/50b lim: 105 exec/s: 0 rss: 72Mb L: 49/49 MS: 4 ChangeBit-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:41.580 [2024-11-17 04:25:20.360170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709355007 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.580 [2024-11-17 04:25:20.360214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.580 [2024-11-17 04:25:20.360344] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.580 [2024-11-17 04:25:20.360366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.580 #7 NEW cov: 12420 ft: 12961 corp: 3/99b lim: 105 exec/s: 0 rss: 72Mb L: 49/49 MS: 1 ChangeBinInt- 00:08:41.840 [2024-11-17 04:25:20.430050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.430081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.840 #8 NEW cov: 12426 ft: 13650 corp: 4/128b lim: 105 exec/s: 0 rss: 72Mb L: 29/49 MS: 1 EraseBytes- 00:08:41.840 [2024-11-17 04:25:20.480380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709355007 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.480411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.840 [2024-11-17 04:25:20.480547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.480569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.840 #9 NEW cov: 12511 ft: 13865 corp: 5/177b lim: 105 exec/s: 0 rss: 72Mb L: 49/49 MS: 1 ShuffleBytes- 00:08:41.840 [2024-11-17 04:25:20.550637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.550672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.840 [2024-11-17 04:25:20.550811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.550837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.840 #10 NEW cov: 12511 ft: 13982 corp: 6/226b lim: 105 exec/s: 0 rss: 72Mb L: 49/49 MS: 1 ShuffleBytes- 00:08:41.840 [2024-11-17 04:25:20.600748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.600779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.840 [2024-11-17 04:25:20.600927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.840 [2024-11-17 04:25:20.600945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.840 #11 NEW cov: 12511 ft: 14041 corp: 7/282b lim: 105 exec/s: 0 rss: 72Mb L: 56/56 MS: 1 CopyPart- 00:08:42.100 [2024-11-17 04:25:20.671110] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.100 [2024-11-17 04:25:20.671149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.100 [2024-11-17 04:25:20.671273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.100 [2024-11-17 04:25:20.671297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.100 [2024-11-17 04:25:20.671423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.100 [2024-11-17 04:25:20.671449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.100 #12 NEW cov: 12511 ft: 14410 corp: 8/358b lim: 105 exec/s: 0 rss: 72Mb L: 76/76 MS: 1 CopyPart- 00:08:42.100 [2024-11-17 04:25:20.741149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18445618173802512383 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.100 [2024-11-17 04:25:20.741175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.100 [2024-11-17 04:25:20.741303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.100 [2024-11-17 04:25:20.741322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.100 #13 NEW cov: 12511 ft: 14433 corp: 9/407b lim: 105 exec/s: 0 rss: 72Mb L: 49/76 MS: 1 ChangeBit- 00:08:42.101 [2024-11-17 04:25:20.791727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.791759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.101 [2024-11-17 04:25:20.791861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.791885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.101 [2024-11-17 04:25:20.792008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.792029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.101 [2024-11-17 04:25:20.792146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.792170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.101 #14 NEW cov: 12511 ft: 14962 corp: 
10/498b lim: 105 exec/s: 0 rss: 72Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:08:42.101 [2024-11-17 04:25:20.841403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.841430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.101 [2024-11-17 04:25:20.841562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.841588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.101 #15 NEW cov: 12511 ft: 15132 corp: 11/554b lim: 105 exec/s: 0 rss: 72Mb L: 56/91 MS: 1 ChangeBit- 00:08:42.101 [2024-11-17 04:25:20.891437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.101 [2024-11-17 04:25:20.891465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.101 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:42.101 #16 NEW cov: 12534 ft: 15173 corp: 12/579b lim: 105 exec/s: 0 rss: 72Mb L: 25/91 MS: 1 EraseBytes- 00:08:42.361 [2024-11-17 04:25:20.941679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.361 [2024-11-17 04:25:20.941724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.361 [2024-11-17 04:25:20.941843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.361 [2024-11-17 04:25:20.941868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.361 #17 NEW cov: 12534 ft: 15271 corp: 13/636b lim: 105 exec/s: 0 rss: 72Mb L: 57/91 MS: 1 InsertByte- 00:08:42.361 [2024-11-17 04:25:21.011991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709355007 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.361 [2024-11-17 04:25:21.012029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.361 [2024-11-17 04:25:21.012156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.361 [2024-11-17 04:25:21.012182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.361 #18 NEW cov: 12534 ft: 15286 corp: 14/685b lim: 105 exec/s: 18 rss: 73Mb L: 49/91 MS: 1 ChangeBit- 00:08:42.361 [2024-11-17 04:25:21.081933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.361 [2024-11-17 04:25:21.081961] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.361 #19 NEW cov: 12534 ft: 15307 corp: 15/710b lim: 105 exec/s: 19 rss: 73Mb L: 25/91 MS: 1 ChangeBit- 00:08:42.361 [2024-11-17 04:25:21.152159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.361 [2024-11-17 04:25:21.152190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.621 #20 NEW cov: 12534 ft: 15335 corp: 16/736b lim: 105 exec/s: 20 rss: 73Mb L: 26/91 MS: 1 InsertByte- 00:08:42.621 [2024-11-17 04:25:21.222579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.621 [2024-11-17 04:25:21.222609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.621 [2024-11-17 04:25:21.222732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.621 [2024-11-17 04:25:21.222758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.621 #21 NEW cov: 12534 ft: 15350 corp: 17/782b lim: 105 exec/s: 21 rss: 73Mb L: 46/91 MS: 1 CopyPart- 00:08:42.621 [2024-11-17 04:25:21.272638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.621 [2024-11-17 04:25:21.272677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.621 [2024-11-17 04:25:21.272807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.621 [2024-11-17 04:25:21.272831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.621 #22 NEW cov: 12534 ft: 15371 corp: 18/838b lim: 105 exec/s: 22 rss: 73Mb L: 56/91 MS: 1 ShuffleBytes- 00:08:42.621 [2024-11-17 04:25:21.322768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.621 [2024-11-17 04:25:21.322803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.622 [2024-11-17 04:25:21.322924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.622 [2024-11-17 04:25:21.322949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.622 #23 NEW cov: 12534 ft: 15409 corp: 19/894b lim: 105 exec/s: 23 rss: 73Mb L: 56/91 MS: 1 ChangeBit- 00:08:42.622 [2024-11-17 04:25:21.393118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709355007 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.622 [2024-11-17 04:25:21.393157] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.622 [2024-11-17 04:25:21.393277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9223372036854775807 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.622 [2024-11-17 04:25:21.393301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.622 #24 NEW cov: 12534 ft: 15422 corp: 20/943b lim: 105 exec/s: 24 rss: 73Mb L: 49/91 MS: 1 ChangeBit- 00:08:42.622 [2024-11-17 04:25:21.443146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.622 [2024-11-17 04:25:21.443173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.622 [2024-11-17 04:25:21.443304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2089670227099910143 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.622 [2024-11-17 04:25:21.443330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.880 #25 NEW cov: 12534 ft: 15458 corp: 21/990b lim: 105 exec/s: 25 rss: 73Mb L: 47/91 MS: 1 InsertByte- 00:08:42.880 [2024-11-17 04:25:21.513789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.880 [2024-11-17 04:25:21.513820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.880 [2024-11-17 04:25:21.513904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15910334424749497564 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.880 [2024-11-17 04:25:21.513924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.880 [2024-11-17 04:25:21.514049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.880 [2024-11-17 04:25:21.514068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.880 [2024-11-17 04:25:21.514207] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.880 [2024-11-17 04:25:21.514230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.880 #26 NEW cov: 12534 ft: 15506 corp: 22/1081b lim: 105 exec/s: 26 rss: 73Mb L: 91/91 MS: 1 ChangeBit- 00:08:42.881 [2024-11-17 04:25:21.584063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709355007 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.584100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.881 [2024-11-17 04:25:21.584217] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.584240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.881 [2024-11-17 04:25:21.584360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.584385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.881 [2024-11-17 04:25:21.584513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446708889337462783 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.584532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.881 #27 NEW cov: 12534 ft: 15526 corp: 23/1170b lim: 105 exec/s: 27 rss: 73Mb L: 89/91 MS: 1 InsertRepeatedBytes- 00:08:42.881 [2024-11-17 04:25:21.653776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.653812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.881 [2024-11-17 04:25:21.653940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.653965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.881 #28 NEW cov: 12534 ft: 15577 corp: 24/1219b lim: 105 exec/s: 28 rss: 73Mb L: 49/91 MS: 1 ChangeBit- 00:08:42.881 [2024-11-17 04:25:21.703990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709355007 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.704028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.881 [2024-11-17 04:25:21.704148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.881 [2024-11-17 04:25:21.704170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.141 #29 NEW cov: 12534 ft: 15613 corp: 25/1269b lim: 105 exec/s: 29 rss: 73Mb L: 50/91 MS: 1 InsertByte- 00:08:43.141 [2024-11-17 04:25:21.754515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.754543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.754637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15910334424749497564 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.754657] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.754782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.754805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.754929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2656240721398127836 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.754955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.141 #30 NEW cov: 12534 ft: 15630 corp: 26/1360b lim: 105 exec/s: 30 rss: 73Mb L: 91/91 MS: 1 ChangeBinInt- 00:08:43.141 [2024-11-17 04:25:21.824712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.824746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.824844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:15910334424749497564 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.824861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.825004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.825022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.825142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.825163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.141 #31 NEW cov: 12534 ft: 15639 corp: 27/1451b lim: 105 exec/s: 31 rss: 73Mb L: 91/91 MS: 1 ShuffleBytes- 00:08:43.141 [2024-11-17 04:25:21.874282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.874320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.141 #32 NEW cov: 12534 ft: 15656 corp: 28/1492b lim: 105 exec/s: 32 rss: 73Mb L: 41/91 MS: 1 EraseBytes- 00:08:43.141 [2024-11-17 04:25:21.944818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.944854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.944972] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.944999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.141 [2024-11-17 04:25:21.945118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446492285546790911 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.141 [2024-11-17 04:25:21.945140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.141 #33 NEW cov: 12534 ft: 15691 corp: 29/1566b lim: 105 exec/s: 33 rss: 73Mb L: 74/91 MS: 1 CrossOver- 00:08:43.401 [2024-11-17 04:25:21.994901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.401 [2024-11-17 04:25:21.994938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.402 [2024-11-17 04:25:21.995036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:43.402 [2024-11-17 04:25:21.995059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.402 #34 NEW cov: 12534 ft: 15716 corp: 30/1621b lim: 105 exec/s: 17 rss: 74Mb L: 55/91 MS: 1 EraseBytes- 00:08:43.402 #34 DONE cov: 12534 ft: 15716 corp: 30/1621b lim: 105 exec/s: 17 rss: 74Mb 00:08:43.402 Done 34 runs in 2 second(s) 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": 
"4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:43.402 04:25:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:43.402 [2024-11-17 04:25:22.179400] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:43.402 [2024-11-17 04:25:22.179470] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155966 ] 00:08:43.662 [2024-11-17 04:25:22.379464] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.662 [2024-11-17 04:25:22.391880] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.662 [2024-11-17 04:25:22.444513] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.662 [2024-11-17 04:25:22.460842] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:43.662 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.662 INFO: Seed: 1110708624 00:08:43.922 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:43.922 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:43.922 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:43.922 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.922 #2 INITED exec/s: 0 rss: 65Mb 00:08:43.922 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:43.922 This may also happen if the target rejected all inputs we tried so far 00:08:43.922 [2024-11-17 04:25:22.526892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.922 [2024-11-17 04:25:22.526931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.181 NEW_FUNC[1/717]: 0x46c648 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:44.182 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:44.182 #33 NEW cov: 12329 ft: 12329 corp: 2/35b lim: 120 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:44.182 [2024-11-17 04:25:22.867975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.868023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.182 [2024-11-17 04:25:22.868149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.868177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.182 #34 NEW cov: 12442 ft: 13804 corp: 3/93b lim: 120 exec/s: 0 rss: 72Mb L: 58/58 MS: 1 CopyPart- 00:08:44.182 [2024-11-17 04:25:22.948482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.948519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.182 [2024-11-17 04:25:22.948623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.948645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.182 [2024-11-17 04:25:22.948758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.948778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.182 #35 NEW cov: 12448 ft: 14406 corp: 4/175b lim: 120 exec/s: 0 rss: 72Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:08:44.182 [2024-11-17 04:25:22.998532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.998568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.182 [2024-11-17 04:25:22.998672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.998698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.182 [2024-11-17 04:25:22.998810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.182 [2024-11-17 04:25:22.998833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.442 #36 NEW cov: 12533 ft: 14644 corp: 5/257b lim: 120 exec/s: 0 rss: 72Mb L: 82/82 MS: 1 ChangeBinInt- 00:08:44.442 [2024-11-17 04:25:23.068812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.068845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 04:25:23.068932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.068956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 04:25:23.069078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.069096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.442 #37 NEW cov: 12533 ft: 14805 corp: 6/336b lim: 120 exec/s: 0 rss: 72Mb L: 79/82 MS: 1 EraseBytes- 00:08:44.442 [2024-11-17 04:25:23.118672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.118713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 04:25:23.118841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.118867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.442 #38 NEW cov: 12533 ft: 14881 corp: 7/394b lim: 120 exec/s: 0 rss: 72Mb L: 58/82 MS: 1 CopyPart- 00:08:44.442 [2024-11-17 04:25:23.189144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.189179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 04:25:23.189292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.189312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 
04:25:23.189423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.189448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.442 #39 NEW cov: 12533 ft: 14938 corp: 8/473b lim: 120 exec/s: 0 rss: 72Mb L: 79/82 MS: 1 CopyPart- 00:08:44.442 [2024-11-17 04:25:23.259301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.259335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 04:25:23.259434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.259457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.442 [2024-11-17 04:25:23.259570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.442 [2024-11-17 04:25:23.259593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.702 #40 NEW cov: 12533 ft: 14980 corp: 9/556b lim: 120 exec/s: 0 rss: 73Mb L: 83/83 MS: 1 InsertByte- 00:08:44.702 [2024-11-17 04:25:23.329205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.329239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.329359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.329382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.702 #41 NEW cov: 12533 ft: 15030 corp: 10/615b lim: 120 exec/s: 0 rss: 73Mb L: 59/83 MS: 1 InsertByte- 00:08:44.702 [2024-11-17 04:25:23.399451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.399486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.399595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.399617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.702 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:44.702 #42 NEW cov: 12556 ft: 15097 corp: 11/674b lim: 120 exec/s: 0 rss: 73Mb L: 59/83 MS: 1 ChangeByte- 00:08:44.702 
[2024-11-17 04:25:23.469966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.470002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.470110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709544959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.470131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.470247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.470269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.702 #43 NEW cov: 12556 ft: 15115 corp: 12/756b lim: 120 exec/s: 0 rss: 73Mb L: 82/83 MS: 1 ChangeByte- 00:08:44.702 [2024-11-17 04:25:23.520269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.520300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.520409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.520431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.520552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.520575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.702 [2024-11-17 04:25:23.520691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:10634005409009210259 len:37780 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.702 [2024-11-17 04:25:23.520716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.962 #44 NEW cov: 12556 ft: 15473 corp: 13/854b lim: 120 exec/s: 44 rss: 73Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:44.962 [2024-11-17 04:25:23.570225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.570258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.962 [2024-11-17 04:25:23.570344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.570365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.962 [2024-11-17 04:25:23.570479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.570501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.962 #45 NEW cov: 12556 ft: 15493 corp: 14/944b lim: 120 exec/s: 45 rss: 73Mb L: 90/98 MS: 1 CopyPart- 00:08:44.962 [2024-11-17 04:25:23.610065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551412 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.610102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.962 [2024-11-17 04:25:23.610214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.610239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.962 #46 NEW cov: 12556 ft: 15623 corp: 15/1003b lim: 120 exec/s: 46 rss: 73Mb L: 59/98 MS: 1 InsertByte- 00:08:44.962 [2024-11-17 04:25:23.660194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.660226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.962 [2024-11-17 04:25:23.660352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65390 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.962 [2024-11-17 04:25:23.660375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.962 #52 NEW cov: 12556 ft: 15658 corp: 16/1052b lim: 120 exec/s: 52 rss: 73Mb L: 49/98 MS: 1 EraseBytes- 00:08:44.963 [2024-11-17 04:25:23.730781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.963 [2024-11-17 04:25:23.730817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.963 [2024-11-17 04:25:23.730937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709544959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.963 [2024-11-17 04:25:23.730960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.963 [2024-11-17 04:25:23.731084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.963 [2024-11-17 04:25:23.731108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.963 #53 NEW cov: 12556 ft: 15704 corp: 17/1134b lim: 120 exec/s: 53 rss: 73Mb L: 82/98 MS: 1 CrossOver- 00:08:45.223 [2024-11-17 04:25:23.801079] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.801114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.223 [2024-11-17 04:25:23.801241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709544959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.801265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.223 [2024-11-17 04:25:23.801383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.801406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.223 #54 NEW cov: 12556 ft: 15711 corp: 18/1216b lim: 120 exec/s: 54 rss: 73Mb L: 82/98 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:45.223 [2024-11-17 04:25:23.850752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.850783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.223 [2024-11-17 04:25:23.850896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446462603027808255 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.850921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.223 #55 NEW cov: 12556 ft: 15743 corp: 19/1275b lim: 120 exec/s: 55 rss: 73Mb L: 59/98 MS: 1 ChangeBinInt- 00:08:45.223 [2024-11-17 04:25:23.900633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.900665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.223 #56 NEW cov: 12559 ft: 15878 corp: 20/1314b lim: 120 exec/s: 56 rss: 73Mb L: 39/98 MS: 1 EraseBytes- 00:08:45.223 [2024-11-17 04:25:23.971421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.971451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.223 [2024-11-17 04:25:23.971582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.971608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.223 [2024-11-17 04:25:23.971729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:23.971752] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.223 #57 NEW cov: 12559 ft: 15901 corp: 21/1408b lim: 120 exec/s: 57 rss: 73Mb L: 94/98 MS: 1 CopyPart- 00:08:45.223 [2024-11-17 04:25:24.041326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:24.041363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.223 [2024-11-17 04:25:24.041473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446743137406681087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.223 [2024-11-17 04:25:24.041495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.484 #58 NEW cov: 12559 ft: 15952 corp: 22/1467b lim: 120 exec/s: 58 rss: 73Mb L: 59/98 MS: 1 ChangeByte- 00:08:45.484 [2024-11-17 04:25:24.091786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.091822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.484 [2024-11-17 04:25:24.091929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.091956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.484 [2024-11-17 04:25:24.092078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.092104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.484 #59 NEW cov: 12559 ft: 15968 corp: 23/1549b lim: 120 exec/s: 59 rss: 73Mb L: 82/98 MS: 1 CrossOver- 00:08:45.484 [2024-11-17 04:25:24.141472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.141500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.484 #62 NEW cov: 12559 ft: 16024 corp: 24/1580b lim: 120 exec/s: 62 rss: 73Mb L: 31/98 MS: 3 ChangeBit-InsertByte-CrossOver- 00:08:45.484 [2024-11-17 04:25:24.192173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.192207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.484 [2024-11-17 04:25:24.192326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709544959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.192349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.484 [2024-11-17 04:25:24.192466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.192489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.484 #68 NEW cov: 12559 ft: 16034 corp: 25/1666b lim: 120 exec/s: 68 rss: 73Mb L: 86/98 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:45.484 [2024-11-17 04:25:24.262129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.262159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.484 [2024-11-17 04:25:24.262284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446462603027808255 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.484 [2024-11-17 04:25:24.262306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.484 #69 NEW cov: 12559 ft: 16083 corp: 26/1725b lim: 120 exec/s: 69 rss: 74Mb L: 59/98 MS: 1 ChangeBinInt- 00:08:45.744 [2024-11-17 04:25:24.332628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.744 [2024-11-17 04:25:24.332664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.744 [2024-11-17 04:25:24.332787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.744 [2024-11-17 04:25:24.332812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.744 [2024-11-17 04:25:24.332936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.744 [2024-11-17 04:25:24.332963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.744 #70 NEW cov: 12559 ft: 16089 corp: 27/1808b lim: 120 exec/s: 70 rss: 74Mb L: 83/98 MS: 1 ShuffleBytes- 00:08:45.744 [2024-11-17 04:25:24.402586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.744 [2024-11-17 04:25:24.402620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.744 [2024-11-17 04:25:24.402740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65390 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.744 [2024-11-17 04:25:24.402766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.744 #71 NEW cov: 12559 ft: 16123 corp: 28/1857b lim: 120 exec/s: 71 rss: 74Mb L: 49/98 MS: 1 ChangeByte- 00:08:45.744 [2024-11-17 04:25:24.472848] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.745 [2024-11-17 04:25:24.472881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.745 [2024-11-17 04:25:24.473004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65390 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.745 [2024-11-17 04:25:24.473025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.745 #72 NEW cov: 12559 ft: 16128 corp: 29/1906b lim: 120 exec/s: 36 rss: 74Mb L: 49/98 MS: 1 CrossOver- 00:08:45.745 #72 DONE cov: 12559 ft: 16128 corp: 29/1906b lim: 120 exec/s: 36 rss: 74Mb 00:08:45.745 ###### Recommended dictionary. ###### 00:08:45.745 "\377\377\377\377" # Uses: 1 00:08:45.745 ###### End of recommended dictionary. ###### 00:08:45.745 Done 72 runs in 2 second(s) 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:46.005 04:25:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c 
/tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:46.005 [2024-11-17 04:25:24.661592] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:46.005 [2024-11-17 04:25:24.661669] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156495 ] 00:08:46.265 [2024-11-17 04:25:24.864538] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.265 [2024-11-17 04:25:24.877412] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.265 [2024-11-17 04:25:24.929923] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:46.265 [2024-11-17 04:25:24.946211] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:46.265 INFO: Running with entropic power schedule (0xFF, 100). 00:08:46.265 INFO: Seed: 3597704337 00:08:46.265 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:46.265 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:46.265 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:46.265 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.265 #2 INITED exec/s: 0 rss: 65Mb 00:08:46.265 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:46.266 This may also happen if the target rejected all inputs we tried so far 00:08:46.266 [2024-11-17 04:25:25.011622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.266 [2024-11-17 04:25:25.011651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.266 [2024-11-17 04:25:25.011710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.266 [2024-11-17 04:25:25.011742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.526 NEW_FUNC[1/715]: 0x46ff38 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:46.526 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:46.526 #22 NEW cov: 12254 ft: 12255 corp: 2/57b lim: 100 exec/s: 0 rss: 72Mb L: 56/56 MS: 5 CopyPart-ChangeBit-InsertByte-CrossOver-InsertRepeatedBytes- 00:08:46.526 [2024-11-17 04:25:25.342795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.526 [2024-11-17 04:25:25.342863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.526 [2024-11-17 04:25:25.342954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.526 [2024-11-17 04:25:25.342986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.526 [2024-11-17 04:25:25.343071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:2 nsid:0 00:08:46.526 [2024-11-17 04:25:25.343101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.786 #23 NEW cov: 12384 ft: 13047 corp: 3/124b lim: 100 exec/s: 0 rss: 72Mb L: 67/67 MS: 1 InsertRepeatedBytes- 00:08:46.786 [2024-11-17 04:25:25.412530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.786 [2024-11-17 04:25:25.412556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.786 [2024-11-17 04:25:25.412614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.786 [2024-11-17 04:25:25.412628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.786 #24 NEW cov: 12390 ft: 13356 corp: 4/180b lim: 100 exec/s: 0 rss: 72Mb L: 56/67 MS: 1 ChangeBinInt- 00:08:46.786 [2024-11-17 04:25:25.452707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.786 [2024-11-17 04:25:25.452735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.786 [2024-11-17 04:25:25.452781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.786 [2024-11-17 04:25:25.452796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.786 [2024-11-17 04:25:25.452847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.786 [2024-11-17 04:25:25.452862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.786 #25 NEW cov: 12475 ft: 13581 corp: 5/247b lim: 100 exec/s: 0 rss: 72Mb L: 67/67 MS: 1 ShuffleBytes- 00:08:46.786 [2024-11-17 04:25:25.512656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.786 [2024-11-17 04:25:25.512682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.786 #29 NEW cov: 12475 ft: 14119 corp: 6/278b lim: 100 exec/s: 0 rss: 72Mb L: 31/67 MS: 4 CopyPart-CrossOver-CrossOver-InsertRepeatedBytes- 00:08:46.786 [2024-11-17 04:25:25.552906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.786 [2024-11-17 04:25:25.552932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.786 [2024-11-17 04:25:25.552989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.786 [2024-11-17 04:25:25.553005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.786 #30 NEW cov: 12475 ft: 14252 corp: 7/334b lim: 100 exec/s: 0 rss: 72Mb L: 56/67 MS: 1 ChangeBinInt- 00:08:46.786 [2024-11-17 04:25:25.613058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.786 [2024-11-17 04:25:25.613084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.786 [2024-11-17 04:25:25.613123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.786 [2024-11-17 04:25:25.613138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.046 #31 NEW cov: 12475 ft: 14327 corp: 8/391b lim: 100 exec/s: 0 rss: 72Mb L: 57/67 MS: 1 InsertByte- 00:08:47.046 [2024-11-17 04:25:25.673240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.046 [2024-11-17 04:25:25.673266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.046 [2024-11-17 04:25:25.673308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.046 [2024-11-17 04:25:25.673323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.046 #32 NEW cov: 12475 ft: 14391 corp: 9/448b lim: 100 exec/s: 0 rss: 73Mb L: 57/67 MS: 1 ShuffleBytes- 00:08:47.046 [2024-11-17 04:25:25.733412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.046 [2024-11-17 04:25:25.733437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.046 [2024-11-17 04:25:25.733488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.046 [2024-11-17 04:25:25.733503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.046 #33 NEW cov: 12475 ft: 14405 corp: 10/491b lim: 100 exec/s: 0 rss: 73Mb L: 43/67 MS: 1 CopyPart- 00:08:47.046 [2024-11-17 04:25:25.793584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.046 [2024-11-17 04:25:25.793611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.046 [2024-11-17 04:25:25.793649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.046 [2024-11-17 04:25:25.793663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.046 #34 NEW cov: 12475 ft: 14464 corp: 11/548b lim: 100 exec/s: 0 rss: 73Mb L: 57/67 MS: 1 ChangeBit- 00:08:47.046 [2024-11-17 04:25:25.853612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.046 [2024-11-17 04:25:25.853639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 #35 NEW cov: 12475 ft: 14546 corp: 12/580b lim: 100 exec/s: 0 rss: 73Mb L: 32/67 MS: 1 InsertByte- 00:08:47.307 [2024-11-17 04:25:25.893982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.307 [2024-11-17 04:25:25.894008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:25.894045] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.307 [2024-11-17 04:25:25.894060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:25.894115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.307 [2024-11-17 04:25:25.894129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.307 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:47.307 #36 NEW cov: 12498 ft: 14634 corp: 13/655b lim: 100 exec/s: 0 rss: 73Mb L: 75/75 MS: 1 CMP- DE: "\377\211v\237\355\330\240\212"- 00:08:47.307 [2024-11-17 04:25:25.934090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.307 [2024-11-17 04:25:25.934114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:25.934147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.307 [2024-11-17 04:25:25.934161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:25.934229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.307 [2024-11-17 04:25:25.934243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.307 #37 NEW cov: 12498 ft: 14657 corp: 14/722b lim: 100 exec/s: 0 rss: 73Mb L: 67/75 MS: 1 PersAutoDict- DE: "\377\211v\237\355\330\240\212"- 00:08:47.307 [2024-11-17 04:25:25.974180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.307 [2024-11-17 04:25:25.974206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:25.974268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.307 [2024-11-17 04:25:25.974283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:25.974336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.307 [2024-11-17 04:25:25.974351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.307 #38 NEW cov: 12498 ft: 14753 corp: 15/789b lim: 100 exec/s: 38 rss: 73Mb L: 67/75 MS: 1 ChangeBit- 00:08:47.307 [2024-11-17 04:25:26.014265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.307 [2024-11-17 04:25:26.014290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:26.014338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.307 [2024-11-17 04:25:26.014353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:26.014405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.307 [2024-11-17 04:25:26.014421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.307 #39 NEW cov: 12498 ft: 14804 corp: 16/856b lim: 100 exec/s: 39 rss: 73Mb L: 67/75 MS: 1 ChangeBinInt- 00:08:47.307 [2024-11-17 04:25:26.074430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.307 [2024-11-17 04:25:26.074456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:26.074504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.307 [2024-11-17 04:25:26.074518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:26.074570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.307 [2024-11-17 04:25:26.074588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.307 #40 NEW cov: 12498 ft: 14828 corp: 17/931b lim: 100 exec/s: 40 rss: 73Mb L: 75/75 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:08:47.307 [2024-11-17 04:25:26.114524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.307 [2024-11-17 04:25:26.114549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:26.114605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.307 [2024-11-17 04:25:26.114620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.307 [2024-11-17 04:25:26.114672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.307 [2024-11-17 04:25:26.114687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.567 #41 NEW cov: 12498 ft: 14858 corp: 18/992b lim: 100 exec/s: 41 rss: 73Mb L: 61/75 MS: 1 CMP- DE: "\001\000\002\000"- 00:08:47.567 [2024-11-17 04:25:26.154593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.567 [2024-11-17 04:25:26.154619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.567 [2024-11-17 04:25:26.154666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.567 [2024-11-17 04:25:26.154681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.567 #42 NEW cov: 12498 ft: 14898 corp: 19/1041b lim: 100 exec/s: 42 rss: 73Mb L: 49/75 MS: 1 EraseBytes- 00:08:47.567 [2024-11-17 04:25:26.214865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.567 [2024-11-17 04:25:26.214890] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.567 [2024-11-17 04:25:26.214928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.567 [2024-11-17 04:25:26.214942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.567 [2024-11-17 04:25:26.214994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.567 [2024-11-17 04:25:26.215008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.567 #48 NEW cov: 12498 ft: 14904 corp: 20/1109b lim: 100 exec/s: 48 rss: 73Mb L: 68/75 MS: 1 InsertRepeatedBytes- 00:08:47.567 [2024-11-17 04:25:26.255048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.567 [2024-11-17 04:25:26.255073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.567 [2024-11-17 04:25:26.255127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.568 [2024-11-17 04:25:26.255141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.568 [2024-11-17 04:25:26.255191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.568 [2024-11-17 04:25:26.255205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.568 [2024-11-17 04:25:26.255255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:47.568 [2024-11-17 04:25:26.255269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.568 #49 NEW cov: 12498 ft: 15175 corp: 21/1200b lim: 100 exec/s: 49 rss: 73Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:08:47.568 [2024-11-17 04:25:26.315004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.568 [2024-11-17 04:25:26.315028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.568 [2024-11-17 04:25:26.315081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.568 [2024-11-17 04:25:26.315095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.568 #50 NEW cov: 12498 ft: 15185 corp: 22/1256b lim: 100 exec/s: 50 rss: 73Mb L: 56/91 MS: 1 ChangeByte- 00:08:47.568 [2024-11-17 04:25:26.355384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.568 [2024-11-17 04:25:26.355408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.568 [2024-11-17 04:25:26.355477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.568 [2024-11-17 04:25:26.355490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.568 [2024-11-17 04:25:26.355543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.568 [2024-11-17 04:25:26.355557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.568 [2024-11-17 04:25:26.355609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:47.568 [2024-11-17 04:25:26.355623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.828 #51 NEW cov: 12498 ft: 15199 corp: 23/1347b lim: 100 exec/s: 51 rss: 73Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:08:47.828 [2024-11-17 04:25:26.415308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.828 [2024-11-17 04:25:26.415333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.415378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.828 [2024-11-17 04:25:26.415393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.828 #52 NEW cov: 12498 ft: 15210 corp: 24/1396b lim: 100 exec/s: 52 rss: 73Mb L: 49/91 MS: 1 ShuffleBytes- 00:08:47.828 [2024-11-17 04:25:26.475566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.828 [2024-11-17 04:25:26.475591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.475639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.828 [2024-11-17 04:25:26.475654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.475708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.828 [2024-11-17 04:25:26.475722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.828 #53 NEW cov: 12498 ft: 15217 corp: 25/1473b lim: 100 exec/s: 53 rss: 74Mb L: 77/91 MS: 1 InsertRepeatedBytes- 00:08:47.828 [2024-11-17 04:25:26.535887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.828 [2024-11-17 04:25:26.535911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.535976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.828 [2024-11-17 04:25:26.535991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.536044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.828 [2024-11-17 04:25:26.536057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.536111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:47.828 [2024-11-17 04:25:26.536126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.828 #54 NEW cov: 12498 ft: 15254 corp: 26/1555b lim: 100 exec/s: 54 rss: 74Mb L: 82/91 MS: 1 CrossOver- 00:08:47.828 [2024-11-17 04:25:26.575777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.828 [2024-11-17 04:25:26.575801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.575854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.828 [2024-11-17 04:25:26.575869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.828 #55 NEW cov: 12498 ft: 15319 corp: 27/1598b lim: 100 exec/s: 55 rss: 74Mb L: 43/91 MS: 1 ChangeByte- 00:08:47.828 [2024-11-17 04:25:26.636085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.828 [2024-11-17 04:25:26.636109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.828 [2024-11-17 04:25:26.636146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.828 [2024-11-17 04:25:26.636160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.828 #56 NEW cov: 12498 ft: 15335 corp: 28/1655b lim: 100 exec/s: 56 rss: 74Mb L: 57/91 MS: 1 ChangeBinInt- 00:08:48.088 [2024-11-17 04:25:26.676146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.088 [2024-11-17 04:25:26.676170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.676225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.088 [2024-11-17 04:25:26.676240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.088 #57 NEW cov: 12498 ft: 15341 corp: 29/1712b lim: 100 exec/s: 57 rss: 74Mb L: 57/91 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:08:48.088 [2024-11-17 04:25:26.736363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.088 [2024-11-17 04:25:26.736388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.736441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.088 [2024-11-17 04:25:26.736456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.088 #58 NEW cov: 12498 ft: 15349 corp: 30/1769b lim: 100 exec/s: 58 rss: 74Mb L: 57/91 MS: 1 ShuffleBytes- 00:08:48.088 [2024-11-17 04:25:26.776453] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.088 [2024-11-17 04:25:26.776477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.776531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.088 [2024-11-17 04:25:26.776548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.088 #59 NEW cov: 12498 ft: 15359 corp: 31/1826b lim: 100 exec/s: 59 rss: 74Mb L: 57/91 MS: 1 InsertByte- 00:08:48.088 [2024-11-17 04:25:26.816533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.088 [2024-11-17 04:25:26.816558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.816613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.088 [2024-11-17 04:25:26.816628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.088 #60 NEW cov: 12498 ft: 15400 corp: 32/1883b lim: 100 exec/s: 60 rss: 74Mb L: 57/91 MS: 1 ShuffleBytes- 00:08:48.088 [2024-11-17 04:25:26.856665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.088 [2024-11-17 04:25:26.856689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.856747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.088 [2024-11-17 04:25:26.856762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.088 #61 NEW cov: 12498 ft: 15454 corp: 33/1940b lim: 100 exec/s: 61 rss: 74Mb L: 57/91 MS: 1 CMP- DE: "\377\001"- 00:08:48.088 [2024-11-17 04:25:26.917128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.088 [2024-11-17 04:25:26.917155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.917207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.088 [2024-11-17 04:25:26.917219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.917269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:48.088 [2024-11-17 04:25:26.917283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.088 [2024-11-17 04:25:26.917335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:48.088 [2024-11-17 04:25:26.917349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.349 #62 NEW cov: 12498 ft: 15476 corp: 34/2031b lim: 100 exec/s: 62 rss: 74Mb L: 91/91 MS: 1 CopyPart- 00:08:48.349 [2024-11-17 
04:25:26.977128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:48.349 [2024-11-17 04:25:26.977154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.349 [2024-11-17 04:25:26.977218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:48.349 [2024-11-17 04:25:26.977233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.349 [2024-11-17 04:25:26.977286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:48.349 [2024-11-17 04:25:26.977300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.349 #63 NEW cov: 12498 ft: 15513 corp: 35/2103b lim: 100 exec/s: 31 rss: 74Mb L: 72/91 MS: 1 InsertRepeatedBytes- 00:08:48.349 #63 DONE cov: 12498 ft: 15513 corp: 35/2103b lim: 100 exec/s: 31 rss: 74Mb 00:08:48.349 ###### Recommended dictionary. ###### 00:08:48.349 "\377\211v\237\355\330\240\212" # Uses: 1 00:08:48.349 "\001\000\000\000\000\000\000\001" # Uses: 1 00:08:48.349 "\001\000\002\000" # Uses: 0 00:08:48.349 "\377\001" # Uses: 0 00:08:48.349 ###### End of recommended dictionary. ###### 00:08:48.349 Done 63 runs in 2 second(s) 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo 
leak:nvmf_ctrlr_create 00:08:48.349 04:25:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:48.349 [2024-11-17 04:25:27.141192] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:48.349 [2024-11-17 04:25:27.141270] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156801 ] 00:08:48.609 [2024-11-17 04:25:27.343201] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.609 [2024-11-17 04:25:27.356751] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.609 [2024-11-17 04:25:27.409291] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:48.609 [2024-11-17 04:25:27.425624] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:48.869 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.869 INFO: Seed: 1781747030 00:08:48.869 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:48.869 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:48.869 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:48.869 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.869 #2 INITED exec/s: 0 rss: 66Mb 00:08:48.869 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:48.870 This may also happen if the target rejected all inputs we tried so far 00:08:48.870 [2024-11-17 04:25:27.502307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:48.870 [2024-11-17 04:25:27.502351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.870 [2024-11-17 04:25:27.502427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:48.870 [2024-11-17 04:25:27.502452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.870 [2024-11-17 04:25:27.502548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:48.870 [2024-11-17 04:25:27.502576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.870 [2024-11-17 04:25:27.502689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:48.870 [2024-11-17 04:25:27.502714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.130 NEW_FUNC[1/715]: 0x472ef8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:49.130 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:49.130 #7 NEW cov: 12232 ft: 12233 corp: 2/42b lim: 50 exec/s: 0 rss: 72Mb L: 41/41 MS: 5 ChangeBit-ChangeBit-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:49.130 [2024-11-17 04:25:27.832507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.130 [2024-11-17 04:25:27.832549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.130 [2024-11-17 04:25:27.832637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:49.130 [2024-11-17 04:25:27.832659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.130 #8 NEW cov: 12362 ft: 13234 corp: 3/71b lim: 50 exec/s: 0 rss: 72Mb L: 29/41 MS: 1 EraseBytes- 00:08:49.130 [2024-11-17 04:25:27.902510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3621713194 len:62721 00:08:49.130 [2024-11-17 04:25:27.902538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.130 #12 NEW cov: 12368 ft: 13774 corp: 4/86b lim: 50 exec/s: 0 rss: 72Mb L: 15/41 MS: 4 InsertByte-CopyPart-ChangeBinInt-CrossOver- 00:08:49.130 [2024-11-17 04:25:27.952731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.130 [2024-11-17 04:25:27.952762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.130 [2024-11-17 04:25:27.952807] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7 len:1 00:08:49.130 [2024-11-17 04:25:27.952830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.390 #13 NEW cov: 12453 ft: 14068 corp: 5/115b lim: 50 exec/s: 0 rss: 72Mb L: 29/41 MS: 1 ChangeBinInt- 00:08:49.390 [2024-11-17 04:25:28.022908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:63232 00:08:49.390 [2024-11-17 04:25:28.022938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.022982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967046 len:1 00:08:49.390 [2024-11-17 04:25:28.023000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.390 #14 NEW cov: 12453 ft: 14214 corp: 6/144b lim: 50 exec/s: 0 rss: 72Mb L: 29/41 MS: 1 ChangeBinInt- 00:08:49.390 [2024-11-17 04:25:28.083512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.390 [2024-11-17 04:25:28.083543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.083601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:55770 00:08:49.390 [2024-11-17 04:25:28.083618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.083711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:49.390 [2024-11-17 04:25:28.083744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.083846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.390 [2024-11-17 04:25:28.083869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.390 #15 NEW cov: 12453 ft: 14343 corp: 7/189b lim: 50 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:49.390 [2024-11-17 04:25:28.133730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.390 [2024-11-17 04:25:28.133761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.133824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:49.390 [2024-11-17 04:25:28.133846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.133943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:49.390 [2024-11-17 04:25:28.133966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.390 [2024-11-17 04:25:28.134067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.390 [2024-11-17 04:25:28.134091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.390 #16 NEW cov: 12453 ft: 14399 corp: 8/230b lim: 50 exec/s: 0 rss: 72Mb L: 41/45 MS: 1 ShuffleBytes- 00:08:49.390 [2024-11-17 04:25:28.183307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3621713194 len:62721 00:08:49.390 [2024-11-17 04:25:28.183333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.651 #17 NEW cov: 12453 ft: 14428 corp: 9/245b lim: 50 exec/s: 0 rss: 73Mb L: 15/45 MS: 1 ShuffleBytes- 00:08:49.651 [2024-11-17 04:25:28.254079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.651 [2024-11-17 04:25:28.254111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.254190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5787213825698693120 len:20561 00:08:49.651 [2024-11-17 04:25:28.254211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.254317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1347420160 len:1 00:08:49.651 [2024-11-17 04:25:28.254340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.254455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.651 [2024-11-17 04:25:28.254479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.651 #18 NEW cov: 12453 ft: 14459 corp: 10/294b lim: 50 exec/s: 0 rss: 73Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:08:49.651 [2024-11-17 04:25:28.304198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.651 [2024-11-17 04:25:28.304229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.304303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:55770 00:08:49.651 [2024-11-17 04:25:28.304323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.304429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:218 00:08:49.651 [2024-11-17 04:25:28.304453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.304560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3640655872 len:1 00:08:49.651 [2024-11-17 04:25:28.304583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.651 #19 NEW cov: 12453 ft: 14502 corp: 11/339b lim: 50 exec/s: 0 rss: 73Mb L: 45/49 MS: 1 CopyPart- 00:08:49.651 [2024-11-17 04:25:28.374397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:43049161850880 len:10024 00:08:49.651 [2024-11-17 04:25:28.374429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.374507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:49.651 [2024-11-17 04:25:28.374524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.374613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:49.651 [2024-11-17 04:25:28.374637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.374759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.651 [2024-11-17 04:25:28.374779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.651 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:49.651 #20 NEW cov: 12476 ft: 14538 corp: 12/384b lim: 50 exec/s: 0 rss: 73Mb L: 45/49 MS: 1 InsertRepeatedBytes- 00:08:49.651 [2024-11-17 04:25:28.424742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.651 [2024-11-17 04:25:28.424771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.424848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5787213825698693120 len:20561 00:08:49.651 [2024-11-17 04:25:28.424863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.424962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1347420160 len:1 00:08:49.651 [2024-11-17 04:25:28.424983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.425100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.651 [2024-11-17 04:25:28.425123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.651 [2024-11-17 04:25:28.425234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:1 00:08:49.651 [2024-11-17 04:25:28.425253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.651 #21 NEW cov: 12476 ft: 14599 corp: 13/434b lim: 50 exec/s: 0 rss: 73Mb L: 50/50 MS: 1 CopyPart- 00:08:49.911 [2024-11-17 04:25:28.494814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:49.911 [2024-11-17 04:25:28.494845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.494903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5636096 len:1 00:08:49.911 [2024-11-17 04:25:28.494927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.495032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:49.911 [2024-11-17 04:25:28.495054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.495168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.911 [2024-11-17 04:25:28.495189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.911 #22 NEW cov: 12476 ft: 14695 corp: 14/475b lim: 50 exec/s: 22 rss: 73Mb L: 41/50 MS: 1 ChangeByte- 00:08:49.911 [2024-11-17 04:25:28.564974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:281471386386432 len:65536 00:08:49.911 [2024-11-17 04:25:28.565004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.565080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65281 00:08:49.911 [2024-11-17 04:25:28.565095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.565196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:49.911 [2024-11-17 04:25:28.565218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.565322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:49.911 [2024-11-17 04:25:28.565344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.911 #23 NEW cov: 12476 ft: 14709 corp: 15/517b lim: 50 exec/s: 23 rss: 73Mb L: 42/50 MS: 1 InsertRepeatedBytes- 00:08:49.911 [2024-11-17 04:25:28.615145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2305843009918337024 len:1 00:08:49.911 [2024-11-17 04:25:28.615177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.615247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:239526031130624 len:55770 00:08:49.911 [2024-11-17 04:25:28.615267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.615363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:218 00:08:49.911 [2024-11-17 04:25:28.615386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.615489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3640655872 len:1 00:08:49.911 [2024-11-17 04:25:28.615510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.911 #24 NEW cov: 12476 ft: 14726 corp: 16/562b lim: 50 exec/s: 24 rss: 73Mb L: 45/50 MS: 1 ChangeBit- 00:08:49.911 [2024-11-17 04:25:28.685060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6510615555426900570 len:23131 00:08:49.911 [2024-11-17 04:25:28.685090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.685169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6510615555426900570 len:23131 00:08:49.911 [2024-11-17 04:25:28.685187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.911 #28 NEW cov: 12476 ft: 14750 corp: 17/584b lim: 50 exec/s: 28 rss: 73Mb L: 22/50 MS: 4 CrossOver-ChangeBit-EraseBytes-InsertRepeatedBytes- 00:08:49.911 [2024-11-17 04:25:28.735271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:49.911 [2024-11-17 04:25:28.735300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.911 [2024-11-17 04:25:28.735359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7 len:1 00:08:49.911 [2024-11-17 04:25:28.735378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.171 #29 NEW cov: 12476 ft: 14784 corp: 18/613b lim: 50 exec/s: 29 rss: 73Mb L: 29/50 MS: 1 CrossOver- 00:08:50.171 [2024-11-17 04:25:28.785685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:50.171 [2024-11-17 04:25:28.785720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.171 [2024-11-17 04:25:28.785767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:55770 00:08:50.172 [2024-11-17 04:25:28.785786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.785875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:50.172 [2024-11-17 04:25:28.785896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.786001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2048 len:1 00:08:50.172 [2024-11-17 04:25:28.786025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.172 #30 NEW cov: 12476 ft: 14812 corp: 19/658b lim: 50 exec/s: 30 rss: 73Mb L: 45/50 MS: 1 ChangeBit- 00:08:50.172 [2024-11-17 04:25:28.835489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:63232 00:08:50.172 [2024-11-17 04:25:28.835518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.835590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4278648831 len:1 00:08:50.172 [2024-11-17 04:25:28.835611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.172 #31 NEW cov: 12476 ft: 14839 corp: 20/687b lim: 50 exec/s: 31 rss: 73Mb L: 29/50 MS: 1 ShuffleBytes- 00:08:50.172 [2024-11-17 04:25:28.906097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:50.172 [2024-11-17 04:25:28.906127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.906163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:55770 00:08:50.172 [2024-11-17 04:25:28.906184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.906282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:218 00:08:50.172 [2024-11-17 04:25:28.906304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.906417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:989855744 len:1 00:08:50.172 [2024-11-17 04:25:28.906441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.172 #32 NEW cov: 12476 ft: 14880 corp: 21/732b lim: 50 exec/s: 32 rss: 73Mb L: 45/50 MS: 1 ChangeByte- 00:08:50.172 [2024-11-17 04:25:28.955799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:63232 00:08:50.172 [2024-11-17 04:25:28.955831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.172 [2024-11-17 04:25:28.955897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967046 len:1 00:08:50.172 [2024-11-17 04:25:28.955916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.172 #33 NEW cov: 12476 ft: 14896 corp: 22/761b lim: 50 exec/s: 33 rss: 73Mb L: 29/50 MS: 1 ChangeByte- 
00:08:50.431 [2024-11-17 04:25:29.006401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:63232 00:08:50.431 [2024-11-17 04:25:29.006432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.431 [2024-11-17 04:25:29.006504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:39561033957768966 len:35981 00:08:50.432 [2024-11-17 04:25:29.006525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.006627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10127624197330734220 len:35981 00:08:50.432 [2024-11-17 04:25:29.006650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.006739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:50.432 [2024-11-17 04:25:29.006763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.432 #34 NEW cov: 12476 ft: 14904 corp: 23/805b lim: 50 exec/s: 34 rss: 73Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:08:50.432 [2024-11-17 04:25:29.056575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:50.432 [2024-11-17 04:25:29.056605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.056684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:55770 00:08:50.432 [2024-11-17 04:25:29.056706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.056795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:50.432 [2024-11-17 04:25:29.056817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.056925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2048 len:65536 00:08:50.432 [2024-11-17 04:25:29.056947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.432 #35 NEW cov: 12476 ft: 14924 corp: 24/850b lim: 50 exec/s: 35 rss: 73Mb L: 45/50 MS: 1 ChangeBinInt- 00:08:50.432 [2024-11-17 04:25:29.126416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6510615555426900541 len:23131 00:08:50.432 [2024-11-17 04:25:29.126449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.126521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6510615555426900570 len:23131 00:08:50.432 [2024-11-17 04:25:29.126544] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.432 #36 NEW cov: 12476 ft: 14966 corp: 25/872b lim: 50 exec/s: 36 rss: 73Mb L: 22/50 MS: 1 ChangeByte- 00:08:50.432 [2024-11-17 04:25:29.196883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:50.432 [2024-11-17 04:25:29.196912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.196978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:55770 00:08:50.432 [2024-11-17 04:25:29.196996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.197044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744069414584575 len:65536 00:08:50.432 [2024-11-17 04:25:29.197063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.432 [2024-11-17 04:25:29.197162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4278192128 len:65536 00:08:50.432 [2024-11-17 04:25:29.197182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.432 #37 NEW cov: 12476 ft: 15024 corp: 26/917b lim: 50 exec/s: 37 rss: 74Mb L: 45/50 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:50.692 [2024-11-17 04:25:29.267254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:50.692 [2024-11-17 04:25:29.267284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.267356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:239526031130624 len:1 00:08:50.692 [2024-11-17 04:25:29.267371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.267461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744069431361535 len:65281 00:08:50.692 [2024-11-17 04:25:29.267478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.267575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:281470815961088 len:1 00:08:50.692 [2024-11-17 04:25:29.267596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.692 #38 NEW cov: 12476 ft: 15055 corp: 27/960b lim: 50 exec/s: 38 rss: 74Mb L: 43/50 MS: 1 EraseBytes- 00:08:50.692 [2024-11-17 04:25:29.337357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:63232 00:08:50.692 [2024-11-17 04:25:29.337384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.337464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:39561033957768966 len:35981 00:08:50.692 [2024-11-17 04:25:29.337480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.337580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10127624197330734220 len:35981 00:08:50.692 [2024-11-17 04:25:29.337599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.337706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:23042 00:08:50.692 [2024-11-17 04:25:29.337731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.692 #39 NEW cov: 12476 ft: 15065 corp: 28/1008b lim: 50 exec/s: 39 rss: 74Mb L: 48/50 MS: 1 CMP- DE: "Z\001\000\000"- 00:08:50.692 [2024-11-17 04:25:29.407648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:50.692 [2024-11-17 04:25:29.407680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.407754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5636096 len:1 00:08:50.692 [2024-11-17 04:25:29.407770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.407873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:24206847997116416 len:1 00:08:50.692 [2024-11-17 04:25:29.407895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.408003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:50.692 [2024-11-17 04:25:29.408026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.692 #40 NEW cov: 12476 ft: 15080 corp: 29/1049b lim: 50 exec/s: 40 rss: 74Mb L: 41/50 MS: 1 CopyPart- 00:08:50.692 [2024-11-17 04:25:29.477435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:63232 00:08:50.692 [2024-11-17 04:25:29.477465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.692 [2024-11-17 04:25:29.477508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967103 len:1 00:08:50.692 [2024-11-17 04:25:29.477528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.692 #41 NEW cov: 12476 ft: 15099 corp: 30/1078b lim: 50 exec/s: 20 rss: 74Mb L: 29/50 MS: 1 ChangeByte- 00:08:50.692 #41 DONE cov: 12476 ft: 15099 corp: 30/1078b lim: 50 exec/s: 20 rss: 74Mb 00:08:50.692 ###### Recommended 
dictionary. ###### 00:08:50.692 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:50.692 "Z\001\000\000" # Uses: 0 00:08:50.692 ###### End of recommended dictionary. ###### 00:08:50.692 Done 41 runs in 2 second(s) 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.952 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.953 04:25:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:50.953 [2024-11-17 04:25:29.664648] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:50.953 [2024-11-17 04:25:29.664739] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157320 ] 00:08:51.213 [2024-11-17 04:25:29.858091] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.213 [2024-11-17 04:25:29.870422] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.213 [2024-11-17 04:25:29.922796] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:51.213 [2024-11-17 04:25:29.939116] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:51.213 INFO: Running with entropic power schedule (0xFF, 100). 00:08:51.213 INFO: Seed: 789247 00:08:51.213 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:51.213 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:51.213 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:51.213 INFO: A corpus is not provided, starting from an empty corpus 00:08:51.213 #2 INITED exec/s: 0 rss: 65Mb 00:08:51.213 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:51.213 This may also happen if the target rejected all inputs we tried so far 00:08:51.213 [2024-11-17 04:25:29.983952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.213 [2024-11-17 04:25:29.983987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.213 [2024-11-17 04:25:29.984038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.213 [2024-11-17 04:25:29.984057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.733 NEW_FUNC[1/717]: 0x474ab8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:51.733 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:51.733 #4 NEW cov: 12304 ft: 12303 corp: 2/42b lim: 90 exec/s: 0 rss: 72Mb L: 41/41 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:51.733 [2024-11-17 04:25:30.354908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.733 [2024-11-17 04:25:30.354953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.733 [2024-11-17 04:25:30.354991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.733 [2024-11-17 04:25:30.355010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.733 #5 NEW cov: 12421 ft: 12943 corp: 3/83b lim: 90 exec/s: 0 rss: 72Mb L: 41/41 MS: 1 ShuffleBytes- 00:08:51.733 [2024-11-17 04:25:30.444925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.733 [2024-11-17 04:25:30.444958] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.733 #10 NEW cov: 12427 ft: 14046 corp: 4/101b lim: 90 exec/s: 0 rss: 72Mb L: 18/41 MS: 5 ChangeByte-CrossOver-CrossOver-InsertByte-InsertByte- 00:08:51.733 [2024-11-17 04:25:30.515091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.733 [2024-11-17 04:25:30.515122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.733 #11 NEW cov: 12512 ft: 14212 corp: 5/120b lim: 90 exec/s: 0 rss: 72Mb L: 19/41 MS: 1 CrossOver- 00:08:51.993 [2024-11-17 04:25:30.575302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.993 [2024-11-17 04:25:30.575332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.993 [2024-11-17 04:25:30.575380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.993 [2024-11-17 04:25:30.575399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.993 #12 NEW cov: 12512 ft: 14318 corp: 6/161b lim: 90 exec/s: 0 rss: 72Mb L: 41/41 MS: 1 ShuffleBytes- 00:08:51.993 [2024-11-17 04:25:30.635487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.993 [2024-11-17 04:25:30.635517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.993 [2024-11-17 04:25:30.635566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.993 [2024-11-17 04:25:30.635584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.993 #13 NEW cov: 12512 ft: 14347 corp: 7/202b lim: 90 exec/s: 0 rss: 72Mb L: 41/41 MS: 1 ChangeBit- 00:08:51.993 [2024-11-17 04:25:30.725745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.993 [2024-11-17 04:25:30.725776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.993 [2024-11-17 04:25:30.725809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.993 [2024-11-17 04:25:30.725827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.993 #14 NEW cov: 12512 ft: 14410 corp: 8/247b lim: 90 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 CMP- DE: "\031\000\000\000"- 00:08:51.993 [2024-11-17 04:25:30.785897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.993 [2024-11-17 04:25:30.785926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.993 [2024-11-17 04:25:30.785972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.993 [2024-11-17 04:25:30.785989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.993 [2024-11-17 04:25:30.786020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:51.993 [2024-11-17 04:25:30.786036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.253 #15 NEW cov: 12512 ft: 14752 corp: 9/301b lim: 90 exec/s: 0 rss: 72Mb L: 54/54 MS: 1 CopyPart- 00:08:52.253 [2024-11-17 04:25:30.846059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.253 [2024-11-17 04:25:30.846090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.253 [2024-11-17 04:25:30.846126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.253 [2024-11-17 04:25:30.846145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.253 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:52.253 #16 NEW cov: 12535 ft: 14828 corp: 10/342b lim: 90 exec/s: 0 rss: 72Mb L: 41/54 MS: 1 ChangeBit- 00:08:52.253 [2024-11-17 04:25:30.906118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.253 [2024-11-17 04:25:30.906147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.253 #17 NEW cov: 12535 ft: 14949 corp: 11/363b lim: 90 exec/s: 17 rss: 72Mb L: 21/54 MS: 1 EraseBytes- 00:08:52.253 [2024-11-17 04:25:30.996405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.253 [2024-11-17 04:25:30.996434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.253 [2024-11-17 04:25:30.996481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.253 [2024-11-17 04:25:30.996498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.253 #18 NEW cov: 12535 ft: 15048 corp: 12/408b lim: 90 exec/s: 18 rss: 72Mb L: 45/54 MS: 1 CrossOver- 00:08:52.254 [2024-11-17 04:25:31.077127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.254 [2024-11-17 04:25:31.077156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.514 #19 NEW cov: 12535 ft: 15092 corp: 13/427b lim: 90 exec/s: 19 rss: 72Mb L: 19/54 MS: 1 ChangeByte- 00:08:52.514 [2024-11-17 04:25:31.137262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.514 [2024-11-17 04:25:31.137290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.514 #20 NEW cov: 12535 ft: 15184 corp: 14/445b lim: 90 exec/s: 20 rss: 72Mb L: 18/54 MS: 1 ChangeByte- 00:08:52.514 [2024-11-17 04:25:31.197460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.514 
[2024-11-17 04:25:31.197488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.514 #21 NEW cov: 12535 ft: 15292 corp: 15/467b lim: 90 exec/s: 21 rss: 72Mb L: 22/54 MS: 1 InsertRepeatedBytes- 00:08:52.514 [2024-11-17 04:25:31.237542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.514 [2024-11-17 04:25:31.237569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.514 #22 NEW cov: 12535 ft: 15333 corp: 16/489b lim: 90 exec/s: 22 rss: 72Mb L: 22/54 MS: 1 CopyPart- 00:08:52.514 [2024-11-17 04:25:31.297864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.514 [2024-11-17 04:25:31.297892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.514 [2024-11-17 04:25:31.297934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.514 [2024-11-17 04:25:31.297951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.514 #23 NEW cov: 12535 ft: 15359 corp: 17/530b lim: 90 exec/s: 23 rss: 72Mb L: 41/54 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:08:52.514 [2024-11-17 04:25:31.338148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.514 [2024-11-17 04:25:31.338176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.514 [2024-11-17 04:25:31.338217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.514 [2024-11-17 04:25:31.338232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.514 [2024-11-17 04:25:31.338289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:52.514 [2024-11-17 04:25:31.338304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.774 #24 NEW cov: 12535 ft: 15419 corp: 18/588b lim: 90 exec/s: 24 rss: 72Mb L: 58/58 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:08:52.774 [2024-11-17 04:25:31.398196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.774 [2024-11-17 04:25:31.398224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.774 [2024-11-17 04:25:31.398279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.774 [2024-11-17 04:25:31.398296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.774 #25 NEW cov: 12535 ft: 15431 corp: 19/629b lim: 90 exec/s: 25 rss: 72Mb L: 41/58 MS: 1 CMP- DE: "We\367\037\243v\212\000"- 00:08:52.774 [2024-11-17 04:25:31.458326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.774 [2024-11-17 
04:25:31.458352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.774 [2024-11-17 04:25:31.458390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.774 [2024-11-17 04:25:31.458405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.774 #26 NEW cov: 12535 ft: 15442 corp: 20/672b lim: 90 exec/s: 26 rss: 72Mb L: 43/58 MS: 1 InsertRepeatedBytes- 00:08:52.774 [2024-11-17 04:25:31.518317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.774 [2024-11-17 04:25:31.518343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.774 #27 NEW cov: 12535 ft: 15451 corp: 21/702b lim: 90 exec/s: 27 rss: 72Mb L: 30/58 MS: 1 PersAutoDict- DE: "We\367\037\243v\212\000"- 00:08:52.774 [2024-11-17 04:25:31.578667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.774 [2024-11-17 04:25:31.578705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.774 [2024-11-17 04:25:31.578778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.774 [2024-11-17 04:25:31.578796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.774 #28 NEW cov: 12535 ft: 15509 corp: 22/755b lim: 90 exec/s: 28 rss: 72Mb L: 53/58 MS: 1 InsertRepeatedBytes- 00:08:53.034 [2024-11-17 04:25:31.618767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.034 [2024-11-17 04:25:31.618794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.034 [2024-11-17 04:25:31.618856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:53.034 [2024-11-17 04:25:31.618873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.034 #29 NEW cov: 12535 ft: 15533 corp: 23/808b lim: 90 exec/s: 29 rss: 73Mb L: 53/58 MS: 1 ChangeBinInt- 00:08:53.034 [2024-11-17 04:25:31.679082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.034 [2024-11-17 04:25:31.679110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.034 [2024-11-17 04:25:31.679147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:53.034 [2024-11-17 04:25:31.679163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.034 [2024-11-17 04:25:31.679217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:53.034 [2024-11-17 04:25:31.679231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.034 
#30 NEW cov: 12535 ft: 15542 corp: 24/862b lim: 90 exec/s: 30 rss: 73Mb L: 54/58 MS: 1 InsertByte- 00:08:53.034 [2024-11-17 04:25:31.739301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.034 [2024-11-17 04:25:31.739329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.034 [2024-11-17 04:25:31.739367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:53.034 [2024-11-17 04:25:31.739382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.034 [2024-11-17 04:25:31.739438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:53.034 [2024-11-17 04:25:31.739471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.034 #31 NEW cov: 12535 ft: 15557 corp: 25/921b lim: 90 exec/s: 31 rss: 73Mb L: 59/59 MS: 1 InsertByte- 00:08:53.034 [2024-11-17 04:25:31.799133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.034 [2024-11-17 04:25:31.799161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.034 #37 NEW cov: 12535 ft: 15603 corp: 26/951b lim: 90 exec/s: 37 rss: 73Mb L: 30/59 MS: 1 ChangeByte- 00:08:53.035 [2024-11-17 04:25:31.859773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.035 [2024-11-17 04:25:31.859802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.035 [2024-11-17 04:25:31.859853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:53.035 [2024-11-17 04:25:31.859875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.035 [2024-11-17 04:25:31.859930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:53.035 [2024-11-17 04:25:31.859946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.035 [2024-11-17 04:25:31.860004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:53.035 [2024-11-17 04:25:31.860020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.295 #38 NEW cov: 12535 ft: 15968 corp: 27/1030b lim: 90 exec/s: 38 rss: 73Mb L: 79/79 MS: 1 CopyPart- 00:08:53.295 [2024-11-17 04:25:31.919775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.295 [2024-11-17 04:25:31.919802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.295 [2024-11-17 04:25:31.919867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:53.295 [2024-11-17 04:25:31.919882] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.295 [2024-11-17 04:25:31.919937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:53.295 [2024-11-17 04:25:31.919951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.295 #39 NEW cov: 12535 ft: 15984 corp: 28/1084b lim: 90 exec/s: 39 rss: 73Mb L: 54/79 MS: 1 ChangeBit- 00:08:53.296 [2024-11-17 04:25:31.979787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:53.296 [2024-11-17 04:25:31.979816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.296 [2024-11-17 04:25:31.979895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:53.296 [2024-11-17 04:25:31.979911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.296 #40 NEW cov: 12535 ft: 15994 corp: 29/1137b lim: 90 exec/s: 20 rss: 73Mb L: 53/79 MS: 1 ChangeBit- 00:08:53.296 #40 DONE cov: 12535 ft: 15994 corp: 29/1137b lim: 90 exec/s: 20 rss: 73Mb 00:08:53.296 ###### Recommended dictionary. ###### 00:08:53.296 "\031\000\000\000" # Uses: 3 00:08:53.296 "We\367\037\243v\212\000" # Uses: 2 00:08:53.296 ###### End of recommended dictionary. ###### 00:08:53.296 Done 40 runs in 2 second(s) 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:53.296 04:25:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:53.556 [2024-11-17 04:25:32.148230] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:08:53.556 [2024-11-17 04:25:32.148322] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157797 ] 00:08:53.556 [2024-11-17 04:25:32.346038] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.556 [2024-11-17 04:25:32.358929] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.816 [2024-11-17 04:25:32.411529] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:53.816 [2024-11-17 04:25:32.427847] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:53.816 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.816 INFO: Seed: 2488789943 00:08:53.816 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:53.816 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:53.816 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:53.816 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.816 #2 INITED exec/s: 0 rss: 65Mb 00:08:53.816 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:53.816 This may also happen if the target rejected all inputs we tried so far 00:08:53.816 [2024-11-17 04:25:32.472577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:53.816 [2024-11-17 04:25:32.472612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.076 NEW_FUNC[1/717]: 0x477ce8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:54.076 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:54.076 #19 NEW cov: 12278 ft: 12277 corp: 2/18b lim: 50 exec/s: 0 rss: 72Mb L: 17/17 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:54.076 [2024-11-17 04:25:32.833606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.076 [2024-11-17 04:25:32.833646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.076 [2024-11-17 04:25:32.833684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.076 [2024-11-17 04:25:32.833710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.337 #20 NEW cov: 12396 ft: 13681 corp: 3/43b lim: 50 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CrossOver- 00:08:54.337 [2024-11-17 04:25:32.923620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.337 [2024-11-17 04:25:32.923652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.337 #21 NEW cov: 12402 ft: 13994 corp: 4/61b lim: 50 exec/s: 0 rss: 72Mb L: 18/25 MS: 1 InsertByte- 00:08:54.337 [2024-11-17 04:25:32.983812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.337 [2024-11-17 04:25:32.983842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.337 [2024-11-17 04:25:32.983875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.337 [2024-11-17 04:25:32.983893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.337 #22 NEW cov: 12487 ft: 14226 corp: 5/86b lim: 50 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CopyPart- 00:08:54.337 [2024-11-17 04:25:33.074076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.337 [2024-11-17 04:25:33.074106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.337 [2024-11-17 04:25:33.074155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.337 [2024-11-17 04:25:33.074172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.337 #23 NEW cov: 12487 ft: 14293 corp: 6/112b lim: 50 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 InsertByte- 
00:08:54.337 [2024-11-17 04:25:33.164331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.337 [2024-11-17 04:25:33.164361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.337 [2024-11-17 04:25:33.164395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.337 [2024-11-17 04:25:33.164413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.598 #24 NEW cov: 12487 ft: 14406 corp: 7/137b lim: 50 exec/s: 0 rss: 72Mb L: 25/26 MS: 1 ChangeBit- 00:08:54.598 [2024-11-17 04:25:33.224387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-11-17 04:25:33.224416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.598 #25 NEW cov: 12487 ft: 14520 corp: 8/149b lim: 50 exec/s: 0 rss: 72Mb L: 12/26 MS: 1 EraseBytes- 00:08:54.598 [2024-11-17 04:25:33.314777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-11-17 04:25:33.314807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.598 [2024-11-17 04:25:33.314856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.598 [2024-11-17 04:25:33.314874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.598 [2024-11-17 04:25:33.314905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.598 [2024-11-17 04:25:33.314921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.598 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:54.598 #26 NEW cov: 12510 ft: 14913 corp: 9/179b lim: 50 exec/s: 0 rss: 73Mb L: 30/30 MS: 1 CMP- DE: "\031\000\000\000"- 00:08:54.598 [2024-11-17 04:25:33.404842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-11-17 04:25:33.404872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.858 #27 NEW cov: 12510 ft: 14967 corp: 10/191b lim: 50 exec/s: 27 rss: 73Mb L: 12/30 MS: 1 ChangeBinInt- 00:08:54.858 [2024-11-17 04:25:33.495178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.858 [2024-11-17 04:25:33.495209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.858 [2024-11-17 04:25:33.495242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.858 [2024-11-17 04:25:33.495259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.858 #30 NEW cov: 12510 ft: 15006 corp: 11/215b lim: 50 exec/s: 30 
rss: 73Mb L: 24/30 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:08:54.858 [2024-11-17 04:25:33.555238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.858 [2024-11-17 04:25:33.555269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.858 #31 NEW cov: 12510 ft: 15025 corp: 12/232b lim: 50 exec/s: 31 rss: 73Mb L: 17/30 MS: 1 ChangeBit- 00:08:54.858 [2024-11-17 04:25:33.605363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.858 [2024-11-17 04:25:33.605393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.858 #32 NEW cov: 12519 ft: 15051 corp: 13/249b lim: 50 exec/s: 32 rss: 73Mb L: 17/30 MS: 1 ChangeBit- 00:08:54.858 [2024-11-17 04:25:33.655545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.858 [2024-11-17 04:25:33.655576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.858 [2024-11-17 04:25:33.655609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.858 [2024-11-17 04:25:33.655627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.119 #33 NEW cov: 12519 ft: 15066 corp: 14/276b lim: 50 exec/s: 33 rss: 73Mb L: 27/30 MS: 1 InsertByte- 00:08:55.119 [2024-11-17 04:25:33.715765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.119 [2024-11-17 04:25:33.715796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.119 [2024-11-17 04:25:33.715845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.119 [2024-11-17 04:25:33.715862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.119 #34 NEW cov: 12519 ft: 15116 corp: 15/301b lim: 50 exec/s: 34 rss: 73Mb L: 25/30 MS: 1 ChangeBit- 00:08:55.119 [2024-11-17 04:25:33.806057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.119 [2024-11-17 04:25:33.806087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.119 [2024-11-17 04:25:33.806120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.119 [2024-11-17 04:25:33.806137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.119 [2024-11-17 04:25:33.806166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.119 [2024-11-17 04:25:33.806182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.119 #35 NEW cov: 12519 ft: 15186 corp: 16/338b lim: 50 exec/s: 35 rss: 73Mb L: 37/37 MS: 1 CrossOver- 00:08:55.119 [2024-11-17 04:25:33.896225] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.119 [2024-11-17 04:25:33.896255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.119 [2024-11-17 04:25:33.896293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.119 [2024-11-17 04:25:33.896311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.119 #36 NEW cov: 12519 ft: 15212 corp: 17/367b lim: 50 exec/s: 36 rss: 73Mb L: 29/37 MS: 1 InsertRepeatedBytes- 00:08:55.119 [2024-11-17 04:25:33.946346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.119 [2024-11-17 04:25:33.946377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.119 [2024-11-17 04:25:33.946412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.119 [2024-11-17 04:25:33.946432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.379 #37 NEW cov: 12519 ft: 15256 corp: 18/392b lim: 50 exec/s: 37 rss: 73Mb L: 25/37 MS: 1 ChangeByte- 00:08:55.379 [2024-11-17 04:25:34.007109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.379 [2024-11-17 04:25:34.007138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.379 [2024-11-17 04:25:34.007194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.379 [2024-11-17 04:25:34.007209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.379 #38 NEW cov: 12519 ft: 15354 corp: 19/418b lim: 50 exec/s: 38 rss: 73Mb L: 26/37 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:08:55.379 [2024-11-17 04:25:34.047063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.379 [2024-11-17 04:25:34.047091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.379 #39 NEW cov: 12519 ft: 15378 corp: 20/437b lim: 50 exec/s: 39 rss: 73Mb L: 19/37 MS: 1 InsertByte- 00:08:55.379 [2024-11-17 04:25:34.087353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.379 [2024-11-17 04:25:34.087379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.379 [2024-11-17 04:25:34.087416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.379 [2024-11-17 04:25:34.087433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.379 #40 NEW cov: 12519 ft: 15443 corp: 21/462b lim: 50 exec/s: 40 rss: 73Mb L: 25/37 MS: 1 ChangeByte- 00:08:55.379 [2024-11-17 04:25:34.148154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.379 [2024-11-17 04:25:34.148233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.379 [2024-11-17 04:25:34.148353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.379 [2024-11-17 04:25:34.148399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.379 [2024-11-17 04:25:34.148513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.379 [2024-11-17 04:25:34.148557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.379 #41 NEW cov: 12519 ft: 15567 corp: 22/495b lim: 50 exec/s: 41 rss: 73Mb L: 33/37 MS: 1 CrossOver- 00:08:55.379 [2024-11-17 04:25:34.197667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.379 [2024-11-17 04:25:34.197700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.379 [2024-11-17 04:25:34.197757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.379 [2024-11-17 04:25:34.197773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.640 #42 NEW cov: 12519 ft: 15598 corp: 23/516b lim: 50 exec/s: 42 rss: 73Mb L: 21/37 MS: 1 EraseBytes- 00:08:55.640 [2024-11-17 04:25:34.257850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.640 [2024-11-17 04:25:34.257876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.257931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.640 [2024-11-17 04:25:34.257947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.640 #43 NEW cov: 12519 ft: 15604 corp: 24/537b lim: 50 exec/s: 43 rss: 73Mb L: 21/37 MS: 1 ShuffleBytes- 00:08:55.640 [2024-11-17 04:25:34.318073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.640 [2024-11-17 04:25:34.318100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.318161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.640 [2024-11-17 04:25:34.318178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.640 #44 NEW cov: 12519 ft: 15645 corp: 25/562b lim: 50 exec/s: 44 rss: 73Mb L: 25/37 MS: 1 PersAutoDict- DE: "\031\000\000\000"- 00:08:55.640 [2024-11-17 04:25:34.358276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.640 [2024-11-17 04:25:34.358301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.358336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.640 [2024-11-17 04:25:34.358352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.358407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.640 [2024-11-17 04:25:34.358437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.640 #45 NEW cov: 12519 ft: 15648 corp: 26/599b lim: 50 exec/s: 45 rss: 73Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:55.640 [2024-11-17 04:25:34.418165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.640 [2024-11-17 04:25:34.418194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.640 #46 NEW cov: 12519 ft: 15720 corp: 27/617b lim: 50 exec/s: 46 rss: 73Mb L: 18/37 MS: 1 CrossOver- 00:08:55.640 [2024-11-17 04:25:34.458711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.640 [2024-11-17 04:25:34.458738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.458788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.640 [2024-11-17 04:25:34.458804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.458859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.640 [2024-11-17 04:25:34.458877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.640 [2024-11-17 04:25:34.458933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.640 [2024-11-17 04:25:34.458949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.901 #47 NEW cov: 12519 ft: 16061 corp: 28/657b lim: 50 exec/s: 23 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:55.901 #47 DONE cov: 12519 ft: 16061 corp: 28/657b lim: 50 exec/s: 23 rss: 73Mb 00:08:55.901 ###### Recommended dictionary. ###### 00:08:55.901 "\031\000\000\000" # Uses: 2 00:08:55.901 ###### End of recommended dictionary. 
###### 00:08:55.901 Done 47 runs in 2 second(s) 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:55.901 04:25:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:55.901 [2024-11-17 04:25:34.647550] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:55.901 [2024-11-17 04:25:34.647635] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158137 ] 00:08:56.161 [2024-11-17 04:25:34.848203] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.161 [2024-11-17 04:25:34.862033] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.161 [2024-11-17 04:25:34.914641] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:56.161 [2024-11-17 04:25:34.930982] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:56.161 INFO: Running with entropic power schedule (0xFF, 100). 00:08:56.161 INFO: Seed: 697820570 00:08:56.161 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:56.161 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:56.161 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:56.161 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.161 #2 INITED exec/s: 0 rss: 66Mb 00:08:56.161 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:56.161 This may also happen if the target rejected all inputs we tried so far 00:08:56.422 [2024-11-17 04:25:35.007057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.422 [2024-11-17 04:25:35.007099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.682 NEW_FUNC[1/717]: 0x479fb8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:56.682 NEW_FUNC[2/717]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:56.682 #5 NEW cov: 12291 ft: 12292 corp: 2/24b lim: 85 exec/s: 0 rss: 72Mb L: 23/23 MS: 3 CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:56.682 [2024-11-17 04:25:35.347923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.682 [2024-11-17 04:25:35.347981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.682 #6 NEW cov: 12421 ft: 12940 corp: 3/47b lim: 85 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 CMP- DE: "FB\305i\245v\212\000"- 00:08:56.682 [2024-11-17 04:25:35.418050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.682 [2024-11-17 04:25:35.418084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.682 #8 NEW cov: 12427 ft: 13254 corp: 4/68b lim: 85 exec/s: 0 rss: 72Mb L: 21/23 MS: 2 CrossOver-CrossOver- 00:08:56.682 [2024-11-17 04:25:35.488217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.682 [2024-11-17 04:25:35.488249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.682 #10 NEW cov: 12512 ft: 13516 corp: 5/89b lim: 85 exec/s: 0 rss: 72Mb L: 
21/23 MS: 2 CrossOver-CrossOver- 00:08:56.942 [2024-11-17 04:25:35.538279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.942 [2024-11-17 04:25:35.538306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.942 #11 NEW cov: 12512 ft: 13675 corp: 6/111b lim: 85 exec/s: 0 rss: 72Mb L: 22/23 MS: 1 InsertByte- 00:08:56.942 [2024-11-17 04:25:35.608606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.942 [2024-11-17 04:25:35.608640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.942 #12 NEW cov: 12512 ft: 13757 corp: 7/132b lim: 85 exec/s: 0 rss: 72Mb L: 21/23 MS: 1 ChangeBinInt- 00:08:56.942 [2024-11-17 04:25:35.658703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.942 [2024-11-17 04:25:35.658734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.942 #13 NEW cov: 12512 ft: 13794 corp: 8/155b lim: 85 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 CopyPart- 00:08:56.942 [2024-11-17 04:25:35.708807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.942 [2024-11-17 04:25:35.708840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.942 #14 NEW cov: 12512 ft: 13822 corp: 9/178b lim: 85 exec/s: 0 rss: 73Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:57.202 [2024-11-17 04:25:35.779094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.202 [2024-11-17 04:25:35.779124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.202 #15 NEW cov: 12512 ft: 13881 corp: 10/209b lim: 85 exec/s: 0 rss: 73Mb L: 31/31 MS: 1 PersAutoDict- DE: "FB\305i\245v\212\000"- 00:08:57.202 [2024-11-17 04:25:35.829518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.202 [2024-11-17 04:25:35.829555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.202 [2024-11-17 04:25:35.829688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.202 [2024-11-17 04:25:35.829717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.202 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:57.202 #17 NEW cov: 12535 ft: 14727 corp: 11/252b lim: 85 exec/s: 0 rss: 73Mb L: 43/43 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:57.202 [2024-11-17 04:25:35.899536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.202 [2024-11-17 04:25:35.899563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.202 #18 NEW cov: 12535 ft: 14781 corp: 12/275b lim: 85 exec/s: 0 rss: 73Mb L: 23/43 MS: 1 
ChangeBit- 00:08:57.202 [2024-11-17 04:25:35.969856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.202 [2024-11-17 04:25:35.969887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.202 [2024-11-17 04:25:35.970003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.202 [2024-11-17 04:25:35.970027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.202 #19 NEW cov: 12535 ft: 14864 corp: 13/310b lim: 85 exec/s: 19 rss: 73Mb L: 35/43 MS: 1 InsertRepeatedBytes- 00:08:57.202 [2024-11-17 04:25:36.019836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.203 [2024-11-17 04:25:36.019862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.463 #20 NEW cov: 12535 ft: 14916 corp: 14/340b lim: 85 exec/s: 20 rss: 73Mb L: 30/43 MS: 1 CrossOver- 00:08:57.463 [2024-11-17 04:25:36.070049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.463 [2024-11-17 04:25:36.070076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.463 #21 NEW cov: 12535 ft: 14968 corp: 15/363b lim: 85 exec/s: 21 rss: 73Mb L: 23/43 MS: 1 ShuffleBytes- 00:08:57.463 [2024-11-17 04:25:36.120175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.463 [2024-11-17 04:25:36.120202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.463 #22 NEW cov: 12535 ft: 14979 corp: 16/386b lim: 85 exec/s: 22 rss: 73Mb L: 23/43 MS: 1 ChangeByte- 00:08:57.463 [2024-11-17 04:25:36.190410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.463 [2024-11-17 04:25:36.190436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.463 #23 NEW cov: 12535 ft: 15006 corp: 17/407b lim: 85 exec/s: 23 rss: 73Mb L: 21/43 MS: 1 ChangeBinInt- 00:08:57.463 [2024-11-17 04:25:36.240486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.463 [2024-11-17 04:25:36.240516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.463 #24 NEW cov: 12535 ft: 15019 corp: 18/438b lim: 85 exec/s: 24 rss: 73Mb L: 31/43 MS: 1 PersAutoDict- DE: "FB\305i\245v\212\000"- 00:08:57.463 [2024-11-17 04:25:36.290755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.463 [2024-11-17 04:25:36.290790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.723 #25 NEW cov: 12535 ft: 15093 corp: 19/461b lim: 85 exec/s: 25 rss: 73Mb L: 23/43 MS: 1 PersAutoDict- DE: "FB\305i\245v\212\000"- 00:08:57.723 [2024-11-17 04:25:36.340863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.723 [2024-11-17 04:25:36.340897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.723 #26 NEW cov: 12535 ft: 15098 corp: 20/491b lim: 85 exec/s: 26 rss: 73Mb L: 30/43 MS: 1 PersAutoDict- DE: "FB\305i\245v\212\000"- 00:08:57.723 [2024-11-17 04:25:36.411056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.723 [2024-11-17 04:25:36.411083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.723 #27 NEW cov: 12535 ft: 15126 corp: 21/512b lim: 85 exec/s: 27 rss: 73Mb L: 21/43 MS: 1 ChangeByte- 00:08:57.723 [2024-11-17 04:25:36.461240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.723 [2024-11-17 04:25:36.461268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.723 #28 NEW cov: 12535 ft: 15218 corp: 22/533b lim: 85 exec/s: 28 rss: 73Mb L: 21/43 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:57.723 [2024-11-17 04:25:36.531413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.723 [2024-11-17 04:25:36.531441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.723 #33 NEW cov: 12535 ft: 15227 corp: 23/556b lim: 85 exec/s: 33 rss: 73Mb L: 23/43 MS: 5 ChangeByte-ChangeBit-InsertByte-InsertRepeatedBytes-CMP- DE: "\000\000\000\000\000\000\000\006"- 00:08:57.983 [2024-11-17 04:25:36.581565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.983 [2024-11-17 04:25:36.581592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.983 #34 NEW cov: 12535 ft: 15314 corp: 24/579b lim: 85 exec/s: 34 rss: 73Mb L: 23/43 MS: 1 ChangeBinInt- 00:08:57.983 [2024-11-17 04:25:36.651783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.983 [2024-11-17 04:25:36.651814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.983 #35 NEW cov: 12535 ft: 15330 corp: 25/602b lim: 85 exec/s: 35 rss: 73Mb L: 23/43 MS: 1 ChangeBinInt- 00:08:57.983 [2024-11-17 04:25:36.701918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.983 [2024-11-17 04:25:36.701954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.983 #36 NEW cov: 12535 ft: 15345 corp: 26/633b lim: 85 exec/s: 36 rss: 73Mb L: 31/43 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\006"- 00:08:57.983 [2024-11-17 04:25:36.772124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.983 [2024-11-17 04:25:36.772153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.983 #37 NEW cov: 12535 ft: 15365 corp: 27/651b lim: 85 exec/s: 37 rss: 73Mb L: 18/43 MS: 1 EraseBytes- 
00:08:58.243 [2024-11-17 04:25:36.822290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:58.243 [2024-11-17 04:25:36.822323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.244 #38 NEW cov: 12535 ft: 15376 corp: 28/670b lim: 85 exec/s: 38 rss: 73Mb L: 19/43 MS: 1 EraseBytes- 00:08:58.244 [2024-11-17 04:25:36.872390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:58.244 [2024-11-17 04:25:36.872418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.244 #39 NEW cov: 12535 ft: 15408 corp: 29/703b lim: 85 exec/s: 39 rss: 74Mb L: 33/43 MS: 1 CMP- DE: "\0008"- 00:08:58.244 [2024-11-17 04:25:36.942645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:58.244 [2024-11-17 04:25:36.942671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.244 #40 NEW cov: 12535 ft: 15426 corp: 30/724b lim: 85 exec/s: 40 rss: 74Mb L: 21/43 MS: 1 PersAutoDict- DE: "\0008"- 00:08:58.244 [2024-11-17 04:25:36.992841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:58.244 [2024-11-17 04:25:36.992866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.244 #41 NEW cov: 12535 ft: 15439 corp: 31/754b lim: 85 exec/s: 20 rss: 74Mb L: 30/43 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:58.244 #41 DONE cov: 12535 ft: 15439 corp: 31/754b lim: 85 exec/s: 20 rss: 74Mb 00:08:58.244 ###### Recommended dictionary. ###### 00:08:58.244 "FB\305i\245v\212\000" # Uses: 4 00:08:58.244 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:58.244 "\000\000\000\000\000\000\000\006" # Uses: 1 00:08:58.244 "\0008" # Uses: 1 00:08:58.244 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:58.244 ###### End of recommended dictionary. 
###### 00:08:58.244 Done 41 runs in 2 second(s) 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:58.504 04:25:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:58.504 [2024-11-17 04:25:37.158639] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:08:58.504 [2024-11-17 04:25:37.158730] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158664 ] 00:08:58.764 [2024-11-17 04:25:37.356740] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.764 [2024-11-17 04:25:37.369366] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.764 [2024-11-17 04:25:37.421867] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:58.765 [2024-11-17 04:25:37.438155] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:58.765 INFO: Running with entropic power schedule (0xFF, 100). 00:08:58.765 INFO: Seed: 3204816009 00:08:58.765 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:08:58.765 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:08:58.765 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:58.765 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.765 #2 INITED exec/s: 0 rss: 65Mb 00:08:58.765 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.765 This may also happen if the target rejected all inputs we tried so far 00:08:58.765 [2024-11-17 04:25:37.483002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:58.765 [2024-11-17 04:25:37.483034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.765 [2024-11-17 04:25:37.483085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:58.765 [2024-11-17 04:25:37.483102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.024 NEW_FUNC[1/716]: 0x47d1f8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:59.024 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:59.024 #4 NEW cov: 12242 ft: 12241 corp: 2/13b lim: 25 exec/s: 0 rss: 71Mb L: 12/12 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:59.024 [2024-11-17 04:25:37.844017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.024 [2024-11-17 04:25:37.844058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.024 [2024-11-17 04:25:37.844092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.024 [2024-11-17 04:25:37.844110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.024 [2024-11-17 04:25:37.844140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.024 [2024-11-17 04:25:37.844156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.024 [2024-11-17 
04:25:37.844184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:59.024 [2024-11-17 04:25:37.844200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:59.024 [2024-11-17 04:25:37.844228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:59.024 [2024-11-17 04:25:37.844244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:59.284 #13 NEW cov: 12355 ft: 13325 corp: 3/38b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 4 CrossOver-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:59.284 [2024-11-17 04:25:37.903870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.284 [2024-11-17 04:25:37.903901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.284 [2024-11-17 04:25:37.903949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.284 [2024-11-17 04:25:37.903968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.284 #14 NEW cov: 12361 ft: 13720 corp: 4/50b lim: 25 exec/s: 0 rss: 72Mb L: 12/25 MS: 1 CrossOver- 00:08:59.284 [2024-11-17 04:25:37.994227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.284 [2024-11-17 04:25:37.994258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.284 [2024-11-17 04:25:37.994304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.284 [2024-11-17 04:25:37.994321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.284 [2024-11-17 04:25:37.994351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.284 [2024-11-17 04:25:37.994368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.284 [2024-11-17 04:25:37.994397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:59.284 [2024-11-17 04:25:37.994412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:59.284 #18 NEW cov: 12446 ft: 14008 corp: 5/72b lim: 25 exec/s: 0 rss: 72Mb L: 22/25 MS: 4 CrossOver-CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:59.284 [2024-11-17 04:25:38.084388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.284 [2024-11-17 04:25:38.084417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.284 [2024-11-17 04:25:38.084464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.284 [2024-11-17 04:25:38.084482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.545 #19 NEW cov: 12446 ft: 14088 corp: 6/84b lim: 25 exec/s: 0 rss: 72Mb L: 12/25 MS: 1 ChangeASCIIInt- 00:08:59.545 [2024-11-17 04:25:38.174644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.545 [2024-11-17 04:25:38.174673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.545 [2024-11-17 04:25:38.174729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.545 [2024-11-17 04:25:38.174747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.545 #20 NEW cov: 12446 ft: 14182 corp: 7/97b lim: 25 exec/s: 0 rss: 72Mb L: 13/25 MS: 1 InsertByte- 00:08:59.545 [2024-11-17 04:25:38.234813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.545 [2024-11-17 04:25:38.234853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.545 [2024-11-17 04:25:38.234901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.545 [2024-11-17 04:25:38.234918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.545 #21 NEW cov: 12446 ft: 14230 corp: 8/109b lim: 25 exec/s: 0 rss: 72Mb L: 12/25 MS: 1 CopyPart- 00:08:59.545 [2024-11-17 04:25:38.284896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.545 [2024-11-17 04:25:38.284924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.545 [2024-11-17 04:25:38.284972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.545 [2024-11-17 04:25:38.284989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.545 #22 NEW cov: 12446 ft: 14267 corp: 9/122b lim: 25 exec/s: 0 rss: 72Mb L: 13/25 MS: 1 ShuffleBytes- 00:08:59.804 [2024-11-17 04:25:38.375363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.804 [2024-11-17 04:25:38.375394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.804 [2024-11-17 04:25:38.375426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.804 [2024-11-17 04:25:38.375444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.804 [2024-11-17 04:25:38.375475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.804 [2024-11-17 04:25:38.375492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.804 [2024-11-17 04:25:38.375521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:59.804 [2024-11-17 04:25:38.375537] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:59.804 [2024-11-17 04:25:38.375566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:59.804 [2024-11-17 04:25:38.375582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:59.805 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:59.805 #23 NEW cov: 12469 ft: 14360 corp: 10/147b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CopyPart- 00:08:59.805 [2024-11-17 04:25:38.435345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.805 [2024-11-17 04:25:38.435375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.805 [2024-11-17 04:25:38.435409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.805 [2024-11-17 04:25:38.435427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.805 #24 NEW cov: 12469 ft: 14403 corp: 11/160b lim: 25 exec/s: 24 rss: 72Mb L: 13/25 MS: 1 ShuffleBytes- 00:08:59.805 [2024-11-17 04:25:38.485449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.805 [2024-11-17 04:25:38.485478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.805 [2024-11-17 04:25:38.485526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.805 [2024-11-17 04:25:38.485544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.805 #25 NEW cov: 12469 ft: 14422 corp: 12/172b lim: 25 exec/s: 25 rss: 72Mb L: 12/25 MS: 1 ChangeBit- 00:08:59.805 [2024-11-17 04:25:38.575724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.805 [2024-11-17 04:25:38.575755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.805 [2024-11-17 04:25:38.575791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.805 [2024-11-17 04:25:38.575808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.064 #26 NEW cov: 12469 ft: 14489 corp: 13/186b lim: 25 exec/s: 26 rss: 72Mb L: 14/25 MS: 1 InsertByte- 00:09:00.064 [2024-11-17 04:25:38.666106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.064 [2024-11-17 04:25:38.666136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.064 [2024-11-17 04:25:38.666183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.064 [2024-11-17 04:25:38.666201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:09:00.064 #27 NEW cov: 12469 ft: 14521 corp: 14/198b lim: 25 exec/s: 27 rss: 73Mb L: 12/25 MS: 1 ChangeBinInt- 00:09:00.064 [2024-11-17 04:25:38.756400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.064 [2024-11-17 04:25:38.756431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.064 [2024-11-17 04:25:38.756478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.064 [2024-11-17 04:25:38.756495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.064 [2024-11-17 04:25:38.756525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.064 [2024-11-17 04:25:38.756542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.064 [2024-11-17 04:25:38.756571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:00.064 [2024-11-17 04:25:38.756588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.064 #28 NEW cov: 12469 ft: 14567 corp: 15/220b lim: 25 exec/s: 28 rss: 73Mb L: 22/25 MS: 1 ChangeBinInt- 00:09:00.064 [2024-11-17 04:25:38.846622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.064 [2024-11-17 04:25:38.846653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.064 [2024-11-17 04:25:38.846685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.064 [2024-11-17 04:25:38.846709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.064 [2024-11-17 04:25:38.846756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.064 [2024-11-17 04:25:38.846773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.324 #29 NEW cov: 12469 ft: 14787 corp: 16/238b lim: 25 exec/s: 29 rss: 73Mb L: 18/25 MS: 1 CrossOver- 00:09:00.324 [2024-11-17 04:25:38.936923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.324 [2024-11-17 04:25:38.936952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.936982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.324 [2024-11-17 04:25:38.937000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.937029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.324 [2024-11-17 04:25:38.937046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.937078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:00.324 [2024-11-17 04:25:38.937093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.937121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:09:00.324 [2024-11-17 04:25:38.937137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:00.324 #30 NEW cov: 12469 ft: 14805 corp: 17/263b lim: 25 exec/s: 30 rss: 73Mb L: 25/25 MS: 1 CopyPart- 00:09:00.324 [2024-11-17 04:25:38.987043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.324 [2024-11-17 04:25:38.987073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.987104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.324 [2024-11-17 04:25:38.987121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.987152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.324 [2024-11-17 04:25:38.987168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.987196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:00.324 [2024-11-17 04:25:38.987211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:38.987239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:09:00.324 [2024-11-17 04:25:38.987255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:00.324 #31 NEW cov: 12469 ft: 14840 corp: 18/288b lim: 25 exec/s: 31 rss: 73Mb L: 25/25 MS: 1 CopyPart- 00:09:00.324 [2024-11-17 04:25:39.077253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.324 [2024-11-17 04:25:39.077282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:39.077328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.324 [2024-11-17 04:25:39.077345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:39.077375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.324 [2024-11-17 04:25:39.077391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:39.077419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:3 nsid:0 00:09:00.324 [2024-11-17 04:25:39.077435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.324 [2024-11-17 04:25:39.077463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:09:00.324 [2024-11-17 04:25:39.077479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:00.325 #32 NEW cov: 12469 ft: 14850 corp: 19/313b lim: 25 exec/s: 32 rss: 73Mb L: 25/25 MS: 1 ShuffleBytes- 00:09:00.325 [2024-11-17 04:25:39.127276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.325 [2024-11-17 04:25:39.127309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.325 [2024-11-17 04:25:39.127359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.325 [2024-11-17 04:25:39.127377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.585 #33 NEW cov: 12469 ft: 14868 corp: 20/326b lim: 25 exec/s: 33 rss: 73Mb L: 13/25 MS: 1 ChangeBit- 00:09:00.585 [2024-11-17 04:25:39.217584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.585 [2024-11-17 04:25:39.217613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.217660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.585 [2024-11-17 04:25:39.217677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.217714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.585 [2024-11-17 04:25:39.217731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.217759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:00.585 [2024-11-17 04:25:39.217775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.585 #34 NEW cov: 12469 ft: 14903 corp: 21/348b lim: 25 exec/s: 34 rss: 73Mb L: 22/25 MS: 1 ChangeBit- 00:09:00.585 [2024-11-17 04:25:39.277629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.585 [2024-11-17 04:25:39.277658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.277712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.585 [2024-11-17 04:25:39.277730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.585 #35 NEW cov: 12469 ft: 14913 corp: 22/360b lim: 25 exec/s: 35 rss: 73Mb L: 12/25 MS: 1 CMP- DE: "\376\377\377\377"- 00:09:00.585 
[2024-11-17 04:25:39.327928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.585 [2024-11-17 04:25:39.327957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.327988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.585 [2024-11-17 04:25:39.328006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.328036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.585 [2024-11-17 04:25:39.328052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.328095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:00.585 [2024-11-17 04:25:39.328111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.585 [2024-11-17 04:25:39.328140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:09:00.585 [2024-11-17 04:25:39.328156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:00.585 #36 NEW cov: 12469 ft: 14958 corp: 23/385b lim: 25 exec/s: 36 rss: 73Mb L: 25/25 MS: 1 ChangeBinInt- 00:09:00.845 [2024-11-17 04:25:39.418202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.845 [2024-11-17 04:25:39.418233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.845 [2024-11-17 04:25:39.418264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.845 [2024-11-17 04:25:39.418283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.845 [2024-11-17 04:25:39.418313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:00.845 [2024-11-17 04:25:39.418330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:00.845 [2024-11-17 04:25:39.418359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:00.845 [2024-11-17 04:25:39.418375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.845 [2024-11-17 04:25:39.418404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:09:00.845 [2024-11-17 04:25:39.418419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:00.845 #37 NEW cov: 12469 ft: 15000 corp: 24/410b lim: 25 exec/s: 37 rss: 73Mb L: 25/25 MS: 1 CrossOver- 00:09:00.845 [2024-11-17 04:25:39.478161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 
00:09:00.845 [2024-11-17 04:25:39.478191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.845 [2024-11-17 04:25:39.478238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.845 [2024-11-17 04:25:39.478255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.845 #38 NEW cov: 12469 ft: 15042 corp: 25/422b lim: 25 exec/s: 19 rss: 73Mb L: 12/25 MS: 1 PersAutoDict- DE: "\376\377\377\377"- 00:09:00.845 #38 DONE cov: 12469 ft: 15042 corp: 25/422b lim: 25 exec/s: 19 rss: 73Mb 00:09:00.845 ###### Recommended dictionary. ###### 00:09:00.845 "\376\377\377\377" # Uses: 1 00:09:00.845 ###### End of recommended dictionary. ###### 00:09:00.845 Done 38 runs in 2 second(s) 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:00.845 04:25:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:09:01.105 [2024-11-17 04:25:39.694687] Starting SPDK v25.01-pre git sha1 
83e8405e4 / DPDK 22.11.4 initialization... 00:09:01.105 [2024-11-17 04:25:39.694775] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid159056 ] 00:09:01.105 [2024-11-17 04:25:39.910220] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.105 [2024-11-17 04:25:39.922927] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.366 [2024-11-17 04:25:39.975459] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:01.366 [2024-11-17 04:25:39.991819] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:09:01.366 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.366 INFO: Seed: 1461887674 00:09:01.366 INFO: Loaded 1 modules (387604 inline 8-bit counters): 387604 [0x2a8d80c, 0x2aec220), 00:09:01.366 INFO: Loaded 1 PC tables (387604 PCs): 387604 [0x2aec220,0x30d6360), 00:09:01.366 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:01.366 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.366 #2 INITED exec/s: 0 rss: 64Mb 00:09:01.366 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:01.366 This may also happen if the target rejected all inputs we tried so far 00:09:01.366 [2024-11-17 04:25:40.057106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.366 [2024-11-17 04:25:40.057136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.625 NEW_FUNC[1/715]: 0x47e2e8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:09:01.625 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:01.625 #10 NEW cov: 12293 ft: 12295 corp: 2/31b lim: 100 exec/s: 0 rss: 71Mb L: 30/30 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:09:01.625 [2024-11-17 04:25:40.398107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.625 [2024-11-17 04:25:40.398170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.625 NEW_FUNC[1/2]: 0xfb3cd8 in rte_get_timer_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic/rte_cycles.h:94 00:09:01.625 NEW_FUNC[2/2]: 0x17b9398 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1569 00:09:01.625 #11 NEW cov: 12426 ft: 12996 corp: 3/61b lim: 100 exec/s: 0 rss: 71Mb L: 30/30 MS: 1 ChangeBinInt- 00:09:01.885 [2024-11-17 04:25:40.468118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.468151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.885 #12 NEW cov: 12432 ft: 13182 corp: 
4/91b lim: 100 exec/s: 0 rss: 71Mb L: 30/30 MS: 1 CrossOver- 00:09:01.885 [2024-11-17 04:25:40.508470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.508498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.885 [2024-11-17 04:25:40.508551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.508566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.885 [2024-11-17 04:25:40.508622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.508638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.885 #17 NEW cov: 12517 ft: 14229 corp: 5/157b lim: 100 exec/s: 0 rss: 71Mb L: 66/66 MS: 5 CrossOver-CrossOver-EraseBytes-CrossOver-InsertRepeatedBytes- 00:09:01.885 [2024-11-17 04:25:40.548244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947085841049897 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.548271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.885 #18 NEW cov: 12517 ft: 14352 corp: 6/188b lim: 100 exec/s: 0 rss: 71Mb L: 31/66 MS: 1 CrossOver- 00:09:01.885 [2024-11-17 04:25:40.588343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965941588803004713 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.588371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.885 #24 NEW cov: 12517 ft: 14518 corp: 7/218b lim: 100 exec/s: 0 rss: 72Mb L: 30/66 MS: 1 ChangeBinInt- 00:09:01.885 [2024-11-17 04:25:40.648509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.648537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.885 #25 NEW cov: 12517 ft: 14612 corp: 8/251b lim: 100 exec/s: 0 rss: 72Mb L: 33/66 MS: 1 InsertRepeatedBytes- 00:09:01.885 [2024-11-17 04:25:40.688652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.885 [2024-11-17 04:25:40.688678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.885 #26 NEW cov: 12517 ft: 14657 corp: 9/281b lim: 100 exec/s: 0 rss: 72Mb L: 30/66 MS: 1 ShuffleBytes- 00:09:02.145 [2024-11-17 04:25:40.728737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.145 [2024-11-17 04:25:40.728763] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.145 #27 NEW cov: 12517 ft: 14676 corp: 10/311b lim: 100 exec/s: 0 rss: 72Mb L: 30/66 MS: 1 ChangeBinInt- 00:09:02.145 [2024-11-17 04:25:40.788910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.145 [2024-11-17 04:25:40.788937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.145 #28 NEW cov: 12517 ft: 14751 corp: 11/341b lim: 100 exec/s: 0 rss: 72Mb L: 30/66 MS: 1 ChangeByte- 00:09:02.145 [2024-11-17 04:25:40.849084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.145 [2024-11-17 04:25:40.849115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.145 #29 NEW cov: 12517 ft: 14832 corp: 12/371b lim: 100 exec/s: 0 rss: 72Mb L: 30/66 MS: 1 ShuffleBytes- 00:09:02.145 [2024-11-17 04:25:40.889201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2533601522133575977 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.145 [2024-11-17 04:25:40.889229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.145 #30 NEW cov: 12517 ft: 14893 corp: 13/402b lim: 100 exec/s: 0 rss: 72Mb L: 31/66 MS: 1 InsertByte- 00:09:02.145 [2024-11-17 04:25:40.929309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947085841049897 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.145 [2024-11-17 04:25:40.929336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.145 NEW_FUNC[1/1]: 0x1c4e498 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:02.145 #31 NEW cov: 12540 ft: 14937 corp: 14/433b lim: 100 exec/s: 0 rss: 72Mb L: 31/66 MS: 1 ChangeByte- 00:09:02.405 [2024-11-17 04:25:40.989492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:40.989519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.405 #32 NEW cov: 12540 ft: 14957 corp: 15/463b lim: 100 exec/s: 0 rss: 72Mb L: 30/66 MS: 1 ChangeBit- 00:09:02.405 [2024-11-17 04:25:41.049677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:41.049710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.405 #33 NEW cov: 12540 ft: 14992 corp: 16/493b lim: 100 exec/s: 33 rss: 72Mb L: 30/66 MS: 1 ChangeByte- 00:09:02.405 [2024-11-17 04:25:41.090074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:41.090101] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.405 [2024-11-17 04:25:41.090162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:41.090178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.405 [2024-11-17 04:25:41.090235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:41.090250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.405 #34 NEW cov: 12540 ft: 15004 corp: 17/557b lim: 100 exec/s: 34 rss: 72Mb L: 64/66 MS: 1 EraseBytes- 00:09:02.405 [2024-11-17 04:25:41.149939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:41.149966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.405 #35 NEW cov: 12540 ft: 15051 corp: 18/588b lim: 100 exec/s: 35 rss: 72Mb L: 31/66 MS: 1 InsertByte- 00:09:02.405 [2024-11-17 04:25:41.190064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947085841049897 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.405 [2024-11-17 04:25:41.190091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.405 #36 NEW cov: 12540 ft: 15084 corp: 19/619b lim: 100 exec/s: 36 rss: 72Mb L: 31/66 MS: 1 ChangeByte- 00:09:02.665 [2024-11-17 04:25:41.250237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173337741199747369 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.665 [2024-11-17 04:25:41.250265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.665 #37 NEW cov: 12540 ft: 15093 corp: 20/651b lim: 100 exec/s: 37 rss: 72Mb L: 32/66 MS: 1 InsertByte- 00:09:02.665 [2024-11-17 04:25:41.310687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10744 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.665 [2024-11-17 04:25:41.310718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.666 [2024-11-17 04:25:41.310781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.666 [2024-11-17 04:25:41.310798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.666 [2024-11-17 04:25:41.310855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.666 [2024-11-17 04:25:41.310870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.666 #41 NEW cov: 12540 ft: 15105 corp: 21/730b lim: 100 exec/s: 41 rss: 73Mb L: 79/79 MS: 4 EraseBytes-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:09:02.666 [2024-11-17 04:25:41.370743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.666 [2024-11-17 04:25:41.370772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.666 [2024-11-17 04:25:41.370813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13744632836734762686 len:48831 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.666 [2024-11-17 04:25:41.370829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.666 #42 NEW cov: 12540 ft: 15435 corp: 22/773b lim: 100 exec/s: 42 rss: 73Mb L: 43/79 MS: 1 InsertRepeatedBytes- 00:09:02.666 [2024-11-17 04:25:41.410676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.666 [2024-11-17 04:25:41.410709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.666 #43 NEW cov: 12540 ft: 15454 corp: 23/804b lim: 100 exec/s: 43 rss: 73Mb L: 31/79 MS: 1 InsertByte- 00:09:02.666 [2024-11-17 04:25:41.450806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2966098818965776681 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.666 [2024-11-17 04:25:41.450834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.666 #44 NEW cov: 12540 ft: 15540 corp: 24/835b lim: 100 exec/s: 44 rss: 73Mb L: 31/79 MS: 1 ChangeByte- 00:09:02.926 [2024-11-17 04:25:41.511269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10744 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.511295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.511341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.511358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.511415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.511430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.926 #45 NEW cov: 12540 ft: 15549 corp: 25/914b lim: 100 exec/s: 45 rss: 73Mb L: 79/79 MS: 1 ChangeBinInt- 00:09:02.926 [2024-11-17 04:25:41.571167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.571195] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.926 #46 NEW cov: 12540 ft: 15614 corp: 26/944b lim: 100 exec/s: 46 rss: 73Mb L: 30/79 MS: 1 ChangeByte- 00:09:02.926 [2024-11-17 04:25:41.611752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10744 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.611779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.611834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.611850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.611904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.611919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.611974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.611990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:02.926 #47 NEW cov: 12540 ft: 15980 corp: 27/1024b lim: 100 exec/s: 47 rss: 73Mb L: 80/80 MS: 1 InsertByte- 00:09:02.926 [2024-11-17 04:25:41.651837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10744 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.651865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.651939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.651955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.652010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.652024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.926 [2024-11-17 04:25:41.652079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.652092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:02.926 #48 NEW cov: 12540 ft: 16023 corp: 28/1104b lim: 100 exec/s: 48 rss: 73Mb L: 80/80 MS: 1 ChangeBit- 00:09:02.926 [2024-11-17 04:25:41.711554] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12185952598997633654 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.926 [2024-11-17 04:25:41.711585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.926 #49 NEW cov: 12540 ft: 16032 corp: 29/1134b lim: 100 exec/s: 49 rss: 73Mb L: 30/80 MS: 1 CMP- DE: "\000\212v\251\0353S\016"- 00:09:02.926 [2024-11-17 04:25:41.752214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:258 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.927 [2024-11-17 04:25:41.752241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.927 [2024-11-17 04:25:41.752289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:72340172838076673 len:63488 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.927 [2024-11-17 04:25:41.752304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.927 [2024-11-17 04:25:41.752359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.927 [2024-11-17 04:25:41.752373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.927 [2024-11-17 04:25:41.752429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.927 [2024-11-17 04:25:41.752445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:03.187 #50 NEW cov: 12540 ft: 16065 corp: 30/1232b lim: 100 exec/s: 50 rss: 73Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:09:03.187 [2024-11-17 04:25:41.792126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.792152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.792215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4557430888428284201 len:16192 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.792233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.792289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4557430888798830399 len:16192 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.792305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:03.187 #51 NEW cov: 12540 ft: 16077 corp: 31/1297b lim: 100 exec/s: 51 rss: 73Mb L: 65/98 MS: 1 InsertRepeatedBytes- 00:09:03.187 [2024-11-17 04:25:41.852272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10744 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 
04:25:41.852298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.852359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.852375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.852431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.852446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:03.187 #52 NEW cov: 12540 ft: 16088 corp: 32/1376b lim: 100 exec/s: 52 rss: 73Mb L: 79/98 MS: 1 ChangeBit- 00:09:03.187 [2024-11-17 04:25:41.912456] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.912482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.912520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2387225703791271977 len:8482 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.912536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.912591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:2387225703656530209 len:8482 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.912606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:03.187 #53 NEW cov: 12540 ft: 16131 corp: 33/1443b lim: 100 exec/s: 53 rss: 73Mb L: 67/98 MS: 1 InsertRepeatedBytes- 00:09:03.187 [2024-11-17 04:25:41.972767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.972794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.972843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.972859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.972929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6365935209750747224 len:22617 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.972943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:03.187 [2024-11-17 04:25:41.972997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.187 [2024-11-17 04:25:41.973012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:03.448 #54 NEW cov: 12540 ft: 16166 corp: 34/1529b lim: 100 exec/s: 54 rss: 73Mb L: 86/98 MS: 1 InsertRepeatedBytes- 00:09:03.448 [2024-11-17 04:25:42.032778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2173313551943936297 len:10744 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.448 [2024-11-17 04:25:42.032805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:03.448 [2024-11-17 04:25:42.032844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744069904487182 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.448 [2024-11-17 04:25:42.032859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:03.448 [2024-11-17 04:25:42.032915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.448 [2024-11-17 04:25:42.032931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:03.448 #55 NEW cov: 12540 ft: 16224 corp: 35/1608b lim: 100 exec/s: 27 rss: 73Mb L: 79/98 MS: 1 PersAutoDict- DE: "\000\212v\251\0353S\016"- 00:09:03.448 #55 DONE cov: 12540 ft: 16224 corp: 35/1608b lim: 100 exec/s: 27 rss: 73Mb 00:09:03.448 ###### Recommended dictionary. ###### 00:09:03.448 "\000\212v\251\0353S\016" # Uses: 1 00:09:03.448 ###### End of recommended dictionary. 
###### 00:09:03.448 Done 55 runs in 2 second(s) 00:09:03.448 04:25:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:09:03.448 04:25:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:03.448 04:25:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:03.448 04:25:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:09:03.448 00:09:03.448 real 1m3.080s 00:09:03.448 user 1m39.310s 00:09:03.448 sys 0m7.656s 00:09:03.448 04:25:42 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:03.448 04:25:42 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:03.448 ************************************ 00:09:03.448 END TEST nvmf_llvm_fuzz 00:09:03.448 ************************************ 00:09:03.448 04:25:42 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:09:03.448 04:25:42 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:09:03.448 04:25:42 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:03.448 04:25:42 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:03.448 04:25:42 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:03.448 04:25:42 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:03.448 ************************************ 00:09:03.448 START TEST vfio_llvm_fuzz 00:09:03.448 ************************************ 00:09:03.448 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:03.711 * Looking for test storage... 
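The "#N NEW cov: ..." status lines in the fuzzer output above are standard libFuzzer reporting: "cov" and "ft" count covered code points and features, "corp" gives the corpus size in entries and bytes, "lim" is the current input-length cap, "exec/s" the execution rate, "rss" resident memory, and "MS" lists the mutation sequence (ChangeBinInt, CrossOver, InsertRepeatedBytes, etc.) that produced the new input; the "DONE" line and the "Recommended dictionary" block summarize the run. The SPDK harness functions named in the backtraces (TestOneInput and fuzz_nvm_compare_command in llvm_nvme_fuzz.c) are ordinary libFuzzer targets driven by this loop. As a rough sketch only -- a generic libFuzzer target, not SPDK's actual harness code -- such a target looks like this:

/* minimal_fuzz.c -- generic libFuzzer target (illustrative sketch, not SPDK code).
 * Build: clang -g -O1 -fsanitize=fuzzer minimal_fuzz.c -o minimal_fuzz
 * Run:   ./minimal_fuzz ./corpus_dir    (prints the same "#N NEW cov: ..." lines)
 */
#include <stddef.h>
#include <stdint.h>

/* libFuzzer invokes this once per generated input. */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
	if (size < 1) {
		return 0;
	}

	/* Dispatch on the first byte and treat the rest as payload -- the same
	 * general shape as a command-fuzzing harness that builds a command from
	 * fuzz input and submits it to the target under test. */
	switch (data[0] % 3) {
	case 0:
		/* exercise code path A with data + 1, size - 1 */
		break;
	case 1:
		/* exercise code path B */
		break;
	default:
		break;
	}

	return 0; /* non-zero return values are reserved by libFuzzer */
}

The -F, -t, -D and -Z arguments visible in the llvm_nvme_fuzz invocation above appear to be handled by the SPDK application wrapper rather than by libFuzzer itself (libFuzzer's own flags use the -name=value form), layered on top of this entry point.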
00:09:03.711 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:09:03.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.711 --rc genhtml_branch_coverage=1 00:09:03.711 --rc genhtml_function_coverage=1 00:09:03.711 --rc genhtml_legend=1 00:09:03.711 --rc geninfo_all_blocks=1 00:09:03.711 --rc geninfo_unexecuted_blocks=1 00:09:03.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.711 ' 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:09:03.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.711 --rc genhtml_branch_coverage=1 00:09:03.711 --rc genhtml_function_coverage=1 00:09:03.711 --rc genhtml_legend=1 00:09:03.711 --rc geninfo_all_blocks=1 00:09:03.711 --rc geninfo_unexecuted_blocks=1 00:09:03.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.711 ' 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:09:03.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.711 --rc genhtml_branch_coverage=1 00:09:03.711 --rc genhtml_function_coverage=1 00:09:03.711 --rc genhtml_legend=1 00:09:03.711 --rc geninfo_all_blocks=1 00:09:03.711 --rc geninfo_unexecuted_blocks=1 00:09:03.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.711 ' 00:09:03.711 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:09:03.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.711 --rc genhtml_branch_coverage=1 00:09:03.711 --rc genhtml_function_coverage=1 00:09:03.711 --rc genhtml_legend=1 00:09:03.711 --rc geninfo_all_blocks=1 00:09:03.711 --rc geninfo_unexecuted_blocks=1 00:09:03.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.711 ' 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:09:03.712 04:25:42 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:03.712 04:25:42 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:03.712 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:03.713 #define SPDK_CONFIG_H 00:09:03.713 #define SPDK_CONFIG_AIO_FSDEV 1 00:09:03.713 #define SPDK_CONFIG_APPS 1 00:09:03.713 #define SPDK_CONFIG_ARCH native 00:09:03.713 #undef SPDK_CONFIG_ASAN 00:09:03.713 #undef SPDK_CONFIG_AVAHI 00:09:03.713 #undef SPDK_CONFIG_CET 00:09:03.713 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:09:03.713 #define SPDK_CONFIG_COVERAGE 1 00:09:03.713 #define SPDK_CONFIG_CROSS_PREFIX 00:09:03.713 #undef SPDK_CONFIG_CRYPTO 00:09:03.713 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:03.713 #undef SPDK_CONFIG_CUSTOMOCF 00:09:03.713 #undef SPDK_CONFIG_DAOS 00:09:03.713 #define SPDK_CONFIG_DAOS_DIR 00:09:03.713 #define SPDK_CONFIG_DEBUG 1 00:09:03.713 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:03.713 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:03.713 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:03.713 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.713 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:03.713 #undef SPDK_CONFIG_DPDK_UADK 00:09:03.713 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:03.713 #define SPDK_CONFIG_EXAMPLES 1 00:09:03.713 #undef SPDK_CONFIG_FC 00:09:03.713 #define SPDK_CONFIG_FC_PATH 00:09:03.713 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:03.713 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:03.713 #define SPDK_CONFIG_FSDEV 1 00:09:03.713 #undef 
SPDK_CONFIG_FUSE 00:09:03.713 #define SPDK_CONFIG_FUZZER 1 00:09:03.713 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:03.713 #undef SPDK_CONFIG_GOLANG 00:09:03.713 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:03.713 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:09:03.713 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:03.713 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:09:03.713 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:03.713 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:03.713 #undef SPDK_CONFIG_HAVE_LZ4 00:09:03.713 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:09:03.713 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:09:03.713 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:03.713 #define SPDK_CONFIG_IDXD 1 00:09:03.713 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:03.713 #undef SPDK_CONFIG_IPSEC_MB 00:09:03.713 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:03.713 #define SPDK_CONFIG_ISAL 1 00:09:03.713 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:03.713 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:03.713 #define SPDK_CONFIG_LIBDIR 00:09:03.713 #undef SPDK_CONFIG_LTO 00:09:03.713 #define SPDK_CONFIG_MAX_LCORES 128 00:09:03.713 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:09:03.713 #define SPDK_CONFIG_NVME_CUSE 1 00:09:03.713 #undef SPDK_CONFIG_OCF 00:09:03.713 #define SPDK_CONFIG_OCF_PATH 00:09:03.713 #define SPDK_CONFIG_OPENSSL_PATH 00:09:03.713 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:03.713 #define SPDK_CONFIG_PGO_DIR 00:09:03.713 #undef SPDK_CONFIG_PGO_USE 00:09:03.713 #define SPDK_CONFIG_PREFIX /usr/local 00:09:03.713 #undef SPDK_CONFIG_RAID5F 00:09:03.713 #undef SPDK_CONFIG_RBD 00:09:03.713 #define SPDK_CONFIG_RDMA 1 00:09:03.713 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:03.713 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:03.713 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:03.713 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:03.713 #undef SPDK_CONFIG_SHARED 00:09:03.713 #undef SPDK_CONFIG_SMA 00:09:03.713 #define SPDK_CONFIG_TESTS 1 00:09:03.713 #undef SPDK_CONFIG_TSAN 00:09:03.713 #define SPDK_CONFIG_UBLK 1 00:09:03.713 #define SPDK_CONFIG_UBSAN 1 00:09:03.713 #undef SPDK_CONFIG_UNIT_TESTS 00:09:03.713 #undef SPDK_CONFIG_URING 00:09:03.713 #define SPDK_CONFIG_URING_PATH 00:09:03.713 #undef SPDK_CONFIG_URING_ZNS 00:09:03.713 #undef SPDK_CONFIG_USDT 00:09:03.713 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:03.713 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:03.713 #define SPDK_CONFIG_VFIO_USER 1 00:09:03.713 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:03.713 #define SPDK_CONFIG_VHOST 1 00:09:03.713 #define SPDK_CONFIG_VIRTIO 1 00:09:03.713 #undef SPDK_CONFIG_VTUNE 00:09:03.713 #define SPDK_CONFIG_VTUNE_DIR 00:09:03.713 #define SPDK_CONFIG_WERROR 1 00:09:03.713 #define SPDK_CONFIG_WPDK_DIR 00:09:03.713 #undef SPDK_CONFIG_XNVME 00:09:03.713 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:09:03.713 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:09:03.714 04:25:42 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:09:03.714 
04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.714 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:03.715 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 159520 ]] 00:09:03.976 
04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 159520 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.O7D5Mp 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.O7D5Mp/tests/vfio /tmp/spdk.O7D5Mp 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:09:03.976 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=53092089856 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730594816 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=8638504960 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861869056 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865297408 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340125696 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346122240 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5996544 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30865121280 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865297408 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=176128 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:09:03.977 * Looking for test storage... 
00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=53092089856 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=10853097472 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.977 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:09:03.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.977 --rc genhtml_branch_coverage=1 00:09:03.977 --rc genhtml_function_coverage=1 00:09:03.977 --rc genhtml_legend=1 00:09:03.977 --rc geninfo_all_blocks=1 00:09:03.977 --rc geninfo_unexecuted_blocks=1 00:09:03.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.977 ' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:09:03.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.977 --rc genhtml_branch_coverage=1 00:09:03.977 --rc genhtml_function_coverage=1 00:09:03.977 --rc genhtml_legend=1 00:09:03.977 --rc geninfo_all_blocks=1 00:09:03.977 --rc geninfo_unexecuted_blocks=1 00:09:03.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.977 ' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:09:03.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.977 --rc genhtml_branch_coverage=1 00:09:03.977 --rc genhtml_function_coverage=1 00:09:03.977 --rc genhtml_legend=1 00:09:03.977 --rc geninfo_all_blocks=1 00:09:03.977 --rc geninfo_unexecuted_blocks=1 00:09:03.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.977 ' 00:09:03.977 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:09:03.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.977 --rc genhtml_branch_coverage=1 00:09:03.977 --rc genhtml_function_coverage=1 00:09:03.978 --rc genhtml_legend=1 00:09:03.978 --rc geninfo_all_blocks=1 00:09:03.978 --rc geninfo_unexecuted_blocks=1 00:09:03.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.978 ' 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:09:03.978 04:25:42 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:09:03.978 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:03.978 04:25:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:09:03.978 [2024-11-17 04:25:42.762606] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:09:03.978 [2024-11-17 04:25:42.762703] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid159656 ] 00:09:04.237 [2024-11-17 04:25:42.858691] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.237 [2024-11-17 04:25:42.881606] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.237 INFO: Running with entropic power schedule (0xFF, 100). 00:09:04.237 INFO: Seed: 227882652 00:09:04.497 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:04.497 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:04.497 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:04.497 INFO: A corpus is not provided, starting from an empty corpus 00:09:04.497 #2 INITED exec/s: 0 rss: 65Mb 00:09:04.497 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:04.497 This may also happen if the target rejected all inputs we tried so far 00:09:04.497 [2024-11-17 04:25:43.118339] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:09:04.757 NEW_FUNC[1/671]: 0x4521a8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:09:04.757 NEW_FUNC[2/671]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:04.757 #46 NEW cov: 11163 ft: 11130 corp: 2/7b lim: 6 exec/s: 0 rss: 71Mb L: 6/6 MS: 4 ChangeBit-CopyPart-InsertRepeatedBytes-InsertByte- 00:09:05.017 NEW_FUNC[1/1]: 0x1939868 in nvme_qpair_check_enabled /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:636 00:09:05.017 #47 NEW cov: 11181 ft: 14745 corp: 3/13b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ChangeBit- 00:09:05.276 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:05.276 #49 NEW cov: 11201 ft: 15103 corp: 4/19b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 2 CrossOver-CMP- DE: "\003\000"- 00:09:05.536 #50 NEW cov: 11201 ft: 16618 corp: 5/25b lim: 6 exec/s: 50 rss: 73Mb L: 6/6 MS: 1 ChangeByte- 00:09:05.536 #55 NEW cov: 11211 ft: 17037 corp: 6/31b lim: 6 exec/s: 55 rss: 73Mb L: 6/6 MS: 5 EraseBytes-ChangeBinInt-EraseBytes-CopyPart-PersAutoDict- DE: "\003\000"- 00:09:05.796 #56 NEW cov: 11211 ft: 17210 corp: 7/37b lim: 6 exec/s: 56 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:09:06.055 #62 NEW cov: 11211 ft: 17748 corp: 8/43b lim: 6 exec/s: 62 rss: 74Mb L: 6/6 MS: 1 PersAutoDict- DE: "\003\000"- 00:09:06.315 #63 NEW cov: 11218 ft: 18265 corp: 9/49b lim: 6 exec/s: 63 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes- 00:09:06.315 #64 NEW cov: 11218 ft: 18471 corp: 10/55b lim: 6 exec/s: 64 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:09:06.575 #68 NEW cov: 11218 ft: 18503 corp: 11/61b lim: 6 exec/s: 34 rss: 74Mb L: 6/6 MS: 4 
EraseBytes-ChangeBinInt-ChangeByte-CopyPart- 00:09:06.575 #68 DONE cov: 11218 ft: 18503 corp: 11/61b lim: 6 exec/s: 34 rss: 74Mb 00:09:06.575 ###### Recommended dictionary. ###### 00:09:06.575 "\003\000" # Uses: 2 00:09:06.575 ###### End of recommended dictionary. ###### 00:09:06.575 Done 68 runs in 2 second(s) 00:09:06.575 [2024-11-17 04:25:45.272876] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:06.836 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:06.836 04:25:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:09:06.836 [2024-11-17 04:25:45.530423] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:09:06.836 [2024-11-17 04:25:45.530491] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid160119 ] 00:09:06.836 [2024-11-17 04:25:45.624152] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.836 [2024-11-17 04:25:45.646348] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.095 INFO: Running with entropic power schedule (0xFF, 100). 00:09:07.095 INFO: Seed: 2990916966 00:09:07.095 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:07.095 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:07.095 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:07.095 INFO: A corpus is not provided, starting from an empty corpus 00:09:07.095 #2 INITED exec/s: 0 rss: 65Mb 00:09:07.095 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:07.095 This may also happen if the target rejected all inputs we tried so far 00:09:07.095 [2024-11-17 04:25:45.895033] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:09:07.355 [2024-11-17 04:25:45.962917] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.355 [2024-11-17 04:25:45.962942] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.355 [2024-11-17 04:25:45.962978] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.614 NEW_FUNC[1/674]: 0x452748 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:09:07.614 NEW_FUNC[2/674]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:07.614 #46 NEW cov: 11166 ft: 11092 corp: 2/5b lim: 4 exec/s: 0 rss: 71Mb L: 4/4 MS: 4 ChangeBit-ChangeBit-CMP-InsertByte- DE: "\001\005"- 00:09:07.873 [2024-11-17 04:25:46.447976] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.873 [2024-11-17 04:25:46.448007] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.873 [2024-11-17 04:25:46.448025] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.873 #60 NEW cov: 11180 ft: 14391 corp: 3/9b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 4 PersAutoDict-ChangeByte-CrossOver-CrossOver- DE: "\001\005"- 00:09:07.873 [2024-11-17 04:25:46.659383] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.873 [2024-11-17 04:25:46.659406] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.874 [2024-11-17 04:25:46.659425] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.133 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:08.133 #81 NEW cov: 11197 ft: 15990 corp: 4/13b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 1 ChangeBinInt- 00:09:08.133 [2024-11-17 04:25:46.871088] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.133 [2024-11-17 04:25:46.871111] vfio_user.c:3110:vfio_user_log: *ERROR*: 
/tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.133 [2024-11-17 04:25:46.871129] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.393 #82 NEW cov: 11197 ft: 16886 corp: 5/17b lim: 4 exec/s: 82 rss: 73Mb L: 4/4 MS: 1 ChangeByte- 00:09:08.393 [2024-11-17 04:25:47.063034] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.393 [2024-11-17 04:25:47.063057] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.393 [2024-11-17 04:25:47.063073] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.393 #83 NEW cov: 11197 ft: 17201 corp: 6/21b lim: 4 exec/s: 83 rss: 75Mb L: 4/4 MS: 1 CrossOver- 00:09:08.653 [2024-11-17 04:25:47.262872] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.653 [2024-11-17 04:25:47.262896] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.653 [2024-11-17 04:25:47.262913] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.653 #84 NEW cov: 11197 ft: 17317 corp: 7/25b lim: 4 exec/s: 84 rss: 75Mb L: 4/4 MS: 1 CrossOver- 00:09:08.653 [2024-11-17 04:25:47.458782] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.653 [2024-11-17 04:25:47.458805] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.653 [2024-11-17 04:25:47.458822] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.913 #85 NEW cov: 11197 ft: 17659 corp: 8/29b lim: 4 exec/s: 85 rss: 75Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:08.913 [2024-11-17 04:25:47.649726] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.913 [2024-11-17 04:25:47.649749] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.913 [2024-11-17 04:25:47.649766] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:09.173 #91 NEW cov: 11204 ft: 18018 corp: 9/33b lim: 4 exec/s: 91 rss: 75Mb L: 4/4 MS: 1 CopyPart- 00:09:09.173 [2024-11-17 04:25:47.851768] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:09.173 [2024-11-17 04:25:47.851791] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:09.173 [2024-11-17 04:25:47.851808] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:09.173 #93 NEW cov: 11204 ft: 18482 corp: 10/37b lim: 4 exec/s: 46 rss: 75Mb L: 4/4 MS: 2 EraseBytes-CopyPart- 00:09:09.173 #93 DONE cov: 11204 ft: 18482 corp: 10/37b lim: 4 exec/s: 46 rss: 75Mb 00:09:09.173 ###### Recommended dictionary. ###### 00:09:09.173 "\001\005" # Uses: 3 00:09:09.173 ###### End of recommended dictionary. 
###### 00:09:09.173 Done 93 runs in 2 second(s) 00:09:09.173 [2024-11-17 04:25:47.984900] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:09:09.433 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:09:09.433 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:09.433 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:09.433 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:09.434 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:09.434 04:25:48 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:09.434 [2024-11-17 04:25:48.242577] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 
00:09:09.434 [2024-11-17 04:25:48.242647] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid160656 ] 00:09:09.694 [2024-11-17 04:25:48.336824] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.694 [2024-11-17 04:25:48.358787] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.954 INFO: Running with entropic power schedule (0xFF, 100). 00:09:09.954 INFO: Seed: 1403904523 00:09:09.954 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:09.954 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:09.954 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:09.954 INFO: A corpus is not provided, starting from an empty corpus 00:09:09.954 #2 INITED exec/s: 0 rss: 65Mb 00:09:09.954 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:09.954 This may also happen if the target rejected all inputs we tried so far 00:09:09.954 [2024-11-17 04:25:48.589294] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:09:09.954 [2024-11-17 04:25:48.657975] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.524 NEW_FUNC[1/673]: 0x453138 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:10.524 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:10.524 #11 NEW cov: 11149 ft: 11105 corp: 2/9b lim: 8 exec/s: 0 rss: 71Mb L: 8/8 MS: 4 InsertByte-CopyPart-InsertRepeatedBytes-InsertByte- 00:09:10.524 [2024-11-17 04:25:49.142439] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.524 #26 NEW cov: 11163 ft: 13565 corp: 3/17b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 5 InsertRepeatedBytes-EraseBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:10.524 [2024-11-17 04:25:49.335651] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:10.524 [2024-11-17 04:25:49.335685] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:10.783 NEW_FUNC[1/2]: 0x159a848 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3098 00:09:10.783 NEW_FUNC[2/2]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:10.783 #27 NEW cov: 11190 ft: 15274 corp: 4/25b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:10.783 [2024-11-17 04:25:49.531837] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.043 #28 NEW cov: 11190 ft: 16556 corp: 5/33b lim: 8 exec/s: 28 rss: 73Mb L: 8/8 MS: 1 CrossOver- 00:09:11.043 [2024-11-17 04:25:49.708216] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.043 #29 NEW cov: 11190 ft: 16997 corp: 6/41b lim: 8 exec/s: 29 rss: 75Mb L: 8/8 MS: 1 CopyPart- 00:09:11.304 [2024-11-17 04:25:49.883287] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.304 #30 NEW cov: 11190 ft: 17135 corp: 7/49b lim: 8 exec/s: 30 
rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:11.304 [2024-11-17 04:25:50.056547] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:11.304 [2024-11-17 04:25:50.056580] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:11.564 #31 NEW cov: 11190 ft: 17481 corp: 8/57b lim: 8 exec/s: 31 rss: 75Mb L: 8/8 MS: 1 ChangeBit- 00:09:11.564 [2024-11-17 04:25:50.235901] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.564 #32 NEW cov: 11190 ft: 17583 corp: 9/65b lim: 8 exec/s: 32 rss: 75Mb L: 8/8 MS: 1 ChangeBit- 00:09:11.824 [2024-11-17 04:25:50.414495] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.824 #33 NEW cov: 11197 ft: 17625 corp: 10/73b lim: 8 exec/s: 33 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:11.824 [2024-11-17 04:25:50.593050] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:12.084 #34 NEW cov: 11197 ft: 18146 corp: 11/81b lim: 8 exec/s: 17 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:12.084 #34 DONE cov: 11197 ft: 18146 corp: 11/81b lim: 8 exec/s: 17 rss: 75Mb 00:09:12.084 Done 34 runs in 2 second(s) 00:09:12.085 [2024-11-17 04:25:50.718915] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:12.345 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 
00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:12.345 04:25:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:12.345 [2024-11-17 04:25:50.987889] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:09:12.345 [2024-11-17 04:25:50.987954] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161192 ] 00:09:12.345 [2024-11-17 04:25:51.082154] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.345 [2024-11-17 04:25:51.104138] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.606 INFO: Running with entropic power schedule (0xFF, 100). 00:09:12.606 INFO: Seed: 4154921213 00:09:12.606 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:12.606 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:12.606 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:12.606 INFO: A corpus is not provided, starting from an empty corpus 00:09:12.606 #2 INITED exec/s: 0 rss: 66Mb 00:09:12.606 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:12.606 This may also happen if the target rejected all inputs we tried so far 00:09:12.606 [2024-11-17 04:25:51.351981] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:13.126 NEW_FUNC[1/673]: 0x453828 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:13.126 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:13.126 #51 NEW cov: 11150 ft: 10737 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 ChangeByte-InsertRepeatedBytes-ChangeBinInt-InsertRepeatedBytes- 00:09:13.386 #52 NEW cov: 11164 ft: 14071 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:13.386 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:13.386 #73 NEW cov: 11184 ft: 15925 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:13.647 #74 NEW cov: 11184 ft: 16379 corp: 5/129b lim: 32 exec/s: 74 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:13.907 #90 NEW cov: 11184 ft: 16664 corp: 6/161b lim: 32 exec/s: 90 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:14.167 #91 NEW cov: 11184 ft: 16859 corp: 7/193b lim: 32 exec/s: 91 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:14.167 #92 NEW cov: 11184 ft: 17349 corp: 8/225b lim: 32 exec/s: 92 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:09:14.428 #93 NEW cov: 11191 ft: 17402 corp: 9/257b lim: 32 exec/s: 93 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:14.688 #94 NEW cov: 11191 ft: 17748 corp: 10/289b lim: 32 exec/s: 47 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:14.688 #94 DONE cov: 11191 ft: 17748 corp: 10/289b lim: 32 exec/s: 47 rss: 74Mb 00:09:14.688 Done 94 runs in 2 second(s) 00:09:14.688 [2024-11-17 04:25:53.371890] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:14.949 04:25:53 
llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:14.949 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:14.949 04:25:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:14.949 [2024-11-17 04:25:53.630038] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:09:14.949 [2024-11-17 04:25:53.630105] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161567 ] 00:09:14.949 [2024-11-17 04:25:53.726584] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.949 [2024-11-17 04:25:53.748969] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.209 INFO: Running with entropic power schedule (0xFF, 100). 00:09:15.209 INFO: Seed: 2497949099 00:09:15.209 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:15.209 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:15.209 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:15.209 INFO: A corpus is not provided, starting from an empty corpus 00:09:15.209 #2 INITED exec/s: 0 rss: 65Mb 00:09:15.209 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:15.209 This may also happen if the target rejected all inputs we tried so far 00:09:15.209 [2024-11-17 04:25:53.978437] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:15.728 NEW_FUNC[1/673]: 0x4540a8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:15.728 NEW_FUNC[2/673]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:15.728 #81 NEW cov: 11158 ft: 10770 corp: 2/33b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 4 CopyPart-InsertRepeatedBytes-ChangeBit-CopyPart- 00:09:15.986 #92 NEW cov: 11173 ft: 13721 corp: 3/65b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBit- 00:09:15.986 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:15.986 #98 NEW cov: 11190 ft: 15227 corp: 4/97b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:09:16.245 #99 NEW cov: 11190 ft: 15547 corp: 5/129b lim: 32 exec/s: 99 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:09:16.505 #100 NEW cov: 11190 ft: 16721 corp: 6/161b lim: 32 exec/s: 100 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:09:16.765 #106 NEW cov: 11190 ft: 17051 corp: 7/193b lim: 32 exec/s: 106 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:09:16.765 #107 NEW cov: 11190 ft: 17214 corp: 8/225b lim: 32 exec/s: 107 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:09:17.024 #118 NEW cov: 11190 ft: 17405 corp: 9/257b lim: 32 exec/s: 118 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:09:17.284 #119 NEW cov: 11197 ft: 17646 corp: 10/289b lim: 32 exec/s: 119 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:09:17.284 #120 NEW cov: 11197 ft: 18511 corp: 11/321b lim: 32 exec/s: 60 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:09:17.284 #120 DONE cov: 11197 ft: 18511 corp: 11/321b lim: 32 exec/s: 60 rss: 73Mb 00:09:17.284 Done 120 runs in 2 second(s) 00:09:17.284 [2024-11-17 04:25:56.110900] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:17.545 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:17.545 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:17.546 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:17.546 04:25:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:17.546 [2024-11-17 04:25:56.364824] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:09:17.546 [2024-11-17 04:25:56.364894] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid162021 ] 00:09:17.806 [2024-11-17 04:25:56.458053] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.806 [2024-11-17 04:25:56.480091] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.065 INFO: Running with entropic power schedule (0xFF, 100). 00:09:18.065 INFO: Seed: 944967315 00:09:18.065 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:18.065 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:18.065 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:18.065 INFO: A corpus is not provided, starting from an empty corpus 00:09:18.065 #2 INITED exec/s: 0 rss: 65Mb 00:09:18.065 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:18.065 This may also happen if the target rejected all inputs we tried so far 00:09:18.065 [2024-11-17 04:25:56.720396] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:18.065 [2024-11-17 04:25:56.791914] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.065 [2024-11-17 04:25:56.791948] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.585 NEW_FUNC[1/674]: 0x454aa8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:18.585 NEW_FUNC[2/674]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:18.585 #74 NEW cov: 11165 ft: 10744 corp: 2/14b lim: 13 exec/s: 0 rss: 71Mb L: 13/13 MS: 2 CrossOver-InsertRepeatedBytes- 00:09:18.585 [2024-11-17 04:25:57.276986] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.585 [2024-11-17 04:25:57.277027] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.585 #90 NEW cov: 11182 ft: 13785 corp: 3/27b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ChangeBit- 00:09:18.844 [2024-11-17 04:25:57.454584] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.844 [2024-11-17 04:25:57.454614] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.844 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:18.844 #91 NEW cov: 11199 ft: 15648 corp: 4/40b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 ChangeBit- 00:09:18.844 [2024-11-17 04:25:57.652002] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.844 [2024-11-17 04:25:57.652033] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.104 #94 NEW cov: 11199 ft: 16181 corp: 5/53b lim: 13 exec/s: 94 rss: 73Mb L: 13/13 MS: 3 CrossOver-InsertByte-CrossOver- 00:09:19.104 [2024-11-17 04:25:57.840461] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.104 [2024-11-17 04:25:57.840490] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.363 #100 NEW cov: 11199 ft: 16558 corp: 6/66b lim: 13 exec/s: 100 rss: 73Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:19.363 [2024-11-17 04:25:58.019077] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.364 [2024-11-17 04:25:58.019108] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.364 #111 NEW cov: 11199 ft: 16871 corp: 7/79b lim: 13 exec/s: 111 rss: 73Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:19.623 [2024-11-17 04:25:58.193664] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.623 [2024-11-17 04:25:58.193699] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.623 #112 NEW cov: 11199 ft: 17395 corp: 8/92b lim: 13 exec/s: 112 rss: 73Mb L: 13/13 MS: 1 CopyPart- 00:09:19.623 [2024-11-17 04:25:58.373936] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.623 [2024-11-17 04:25:58.373965] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.883 #113 NEW cov: 
11199 ft: 17567 corp: 9/105b lim: 13 exec/s: 113 rss: 73Mb L: 13/13 MS: 1 ChangeByte- 00:09:19.883 [2024-11-17 04:25:58.552459] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.883 [2024-11-17 04:25:58.552488] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.883 #116 NEW cov: 11206 ft: 17620 corp: 10/118b lim: 13 exec/s: 116 rss: 73Mb L: 13/13 MS: 3 CrossOver-InsertRepeatedBytes-InsertByte- 00:09:20.143 [2024-11-17 04:25:58.733080] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.143 [2024-11-17 04:25:58.733111] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:20.143 #117 NEW cov: 11206 ft: 17995 corp: 11/131b lim: 13 exec/s: 58 rss: 73Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:20.143 #117 DONE cov: 11206 ft: 17995 corp: 11/131b lim: 13 exec/s: 58 rss: 73Mb 00:09:20.143 Done 117 runs in 2 second(s) 00:09:20.143 [2024-11-17 04:25:58.855882] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:20.403 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:20.403 04:25:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:20.403 [2024-11-17 04:25:59.114174] Starting SPDK v25.01-pre git sha1 83e8405e4 / DPDK 22.11.4 initialization... 00:09:20.403 [2024-11-17 04:25:59.114268] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid162551 ] 00:09:20.403 [2024-11-17 04:25:59.210892] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.663 [2024-11-17 04:25:59.233615] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.663 INFO: Running with entropic power schedule (0xFF, 100). 00:09:20.663 INFO: Seed: 3685972170 00:09:20.663 INFO: Loaded 1 modules (384840 inline 8-bit counters): 384840 [0x2a4e00c, 0x2aabf54), 00:09:20.663 INFO: Loaded 1 PC tables (384840 PCs): 384840 [0x2aabf58,0x308b3d8), 00:09:20.663 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:20.663 INFO: A corpus is not provided, starting from an empty corpus 00:09:20.663 #2 INITED exec/s: 0 rss: 65Mb 00:09:20.663 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:20.663 This may also happen if the target rejected all inputs we tried so far 00:09:20.663 [2024-11-17 04:25:59.471042] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:20.922 [2024-11-17 04:25:59.538899] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.922 [2024-11-17 04:25:59.538932] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.182 NEW_FUNC[1/674]: 0x455798 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:21.182 NEW_FUNC[2/674]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:21.182 #27 NEW cov: 11155 ft: 10711 corp: 2/10b lim: 9 exec/s: 0 rss: 71Mb L: 9/9 MS: 5 CrossOver-ChangeBinInt-ChangeBinInt-InsertRepeatedBytes-CopyPart- 00:09:21.442 [2024-11-17 04:26:00.018730] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.442 [2024-11-17 04:26:00.018773] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.442 #38 NEW cov: 11170 ft: 13934 corp: 3/19b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\005h\364f"- 00:09:21.442 [2024-11-17 04:26:00.213019] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.442 [2024-11-17 04:26:00.213058] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.700 NEW_FUNC[1/1]: 0x1c1a8e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:21.700 #44 NEW cov: 11187 ft: 16010 corp: 4/28b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ChangeBit- 00:09:21.701 [2024-11-17 04:26:00.404965] vfio_user.c:3110:vfio_user_log: 
*ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.701 [2024-11-17 04:26:00.405000] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.701 #45 NEW cov: 11187 ft: 16405 corp: 5/37b lim: 9 exec/s: 45 rss: 73Mb L: 9/9 MS: 1 CopyPart- 00:09:21.959 [2024-11-17 04:26:00.580151] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.959 [2024-11-17 04:26:00.580181] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.959 #46 NEW cov: 11187 ft: 17237 corp: 6/46b lim: 9 exec/s: 46 rss: 75Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:21.959 [2024-11-17 04:26:00.751704] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.959 [2024-11-17 04:26:00.751734] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:22.218 #47 NEW cov: 11187 ft: 17364 corp: 7/55b lim: 9 exec/s: 47 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:22.218 [2024-11-17 04:26:00.927993] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:22.218 [2024-11-17 04:26:00.928023] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:22.218 #48 NEW cov: 11187 ft: 17834 corp: 8/64b lim: 9 exec/s: 48 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:09:22.478 [2024-11-17 04:26:01.099337] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:22.478 [2024-11-17 04:26:01.099371] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:22.478 #49 NEW cov: 11187 ft: 17906 corp: 9/73b lim: 9 exec/s: 49 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "\002\000\000\000"- 00:09:22.478 [2024-11-17 04:26:01.275145] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:22.478 [2024-11-17 04:26:01.275176] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:22.737 #50 NEW cov: 11194 ft: 18228 corp: 10/82b lim: 9 exec/s: 50 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:09:22.737 [2024-11-17 04:26:01.453564] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:22.737 [2024-11-17 04:26:01.453596] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:22.737 #51 NEW cov: 11194 ft: 18591 corp: 11/91b lim: 9 exec/s: 25 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:22.737 #51 DONE cov: 11194 ft: 18591 corp: 11/91b lim: 9 exec/s: 25 rss: 75Mb 00:09:22.737 ###### Recommended dictionary. ###### 00:09:22.737 "\000\000\000\000\005h\364f" # Uses: 0 00:09:22.737 "\002\000\000\000" # Uses: 0 00:09:22.737 ###### End of recommended dictionary. 
###### 00:09:22.737 Done 51 runs in 2 second(s) 00:09:22.996 [2024-11-17 04:26:01.578897] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:22.996 04:26:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:22.996 04:26:01 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:22.996 04:26:01 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:22.996 04:26:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:22.996 00:09:22.996 real 0m19.555s 00:09:22.996 user 0m27.720s 00:09:22.996 sys 0m1.979s 00:09:22.996 04:26:01 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:22.996 04:26:01 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:22.996 ************************************ 00:09:22.996 END TEST vfio_llvm_fuzz 00:09:22.996 ************************************ 00:09:23.256 00:09:23.256 real 1m23.016s 00:09:23.256 user 2m7.196s 00:09:23.256 sys 0m9.879s 00:09:23.256 04:26:01 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:23.256 04:26:01 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:23.256 ************************************ 00:09:23.256 END TEST llvm_fuzz 00:09:23.256 ************************************ 00:09:23.256 04:26:01 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]] 00:09:23.256 04:26:01 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT 00:09:23.256 04:26:01 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup 00:09:23.256 04:26:01 -- common/autotest_common.sh@726 -- # xtrace_disable 00:09:23.256 04:26:01 -- common/autotest_common.sh@10 -- # set +x 00:09:23.256 04:26:01 -- spdk/autotest.sh@388 -- # autotest_cleanup 00:09:23.256 04:26:01 -- common/autotest_common.sh@1396 -- # local autotest_es=0 00:09:23.256 04:26:01 -- common/autotest_common.sh@1397 -- # xtrace_disable 00:09:23.256 04:26:01 -- common/autotest_common.sh@10 -- # set +x 00:09:29.906 INFO: APP EXITING 00:09:29.906 INFO: killing all VMs 00:09:29.906 INFO: killing vhost app 00:09:29.906 INFO: EXIT DONE 00:09:33.423 Waiting for block devices as requested 00:09:33.423 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:33.423 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:33.690 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:33.690 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:33.690 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:33.958 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:33.958 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:33.958 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:34.228 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:34.228 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:34.228 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:38.537 Cleaning 00:09:38.537 Removing: /dev/shm/spdk_tgt_trace.pid134843 00:09:38.537 Removing: /var/run/dpdk/spdk_pid132372 00:09:38.537 Removing: /var/run/dpdk/spdk_pid133498 00:09:38.537 Removing: /var/run/dpdk/spdk_pid134843 00:09:38.537 Removing: /var/run/dpdk/spdk_pid135303 00:09:38.537 Removing: 
/var/run/dpdk/spdk_pid136380
00:09:38.537 Removing: /var/run/dpdk/spdk_pid136405
00:09:38.537 Removing: /var/run/dpdk/spdk_pid137519
00:09:38.537 Removing: /var/run/dpdk/spdk_pid137531
00:09:38.537 Removing: /var/run/dpdk/spdk_pid137961
00:09:38.537 Removing: /var/run/dpdk/spdk_pid138294
00:09:38.537 Removing: /var/run/dpdk/spdk_pid138615
00:09:38.537 Removing: /var/run/dpdk/spdk_pid138904
00:09:38.537 Removing: /var/run/dpdk/spdk_pid139032
00:09:38.537 Removing: /var/run/dpdk/spdk_pid139313
00:09:38.537 Removing: /var/run/dpdk/spdk_pid139601
00:09:38.537 Removing: /var/run/dpdk/spdk_pid139919
00:09:38.537 Removing: /var/run/dpdk/spdk_pid140889
00:09:38.537 Removing: /var/run/dpdk/spdk_pid144408
00:09:38.537 Removing: /var/run/dpdk/spdk_pid144551
00:09:38.537 Removing: /var/run/dpdk/spdk_pid144839
00:09:38.537 Removing: /var/run/dpdk/spdk_pid144846
00:09:38.537 Removing: /var/run/dpdk/spdk_pid145410
00:09:38.537 Removing: /var/run/dpdk/spdk_pid145416
00:09:38.537 Removing: /var/run/dpdk/spdk_pid145976
00:09:38.537 Removing: /var/run/dpdk/spdk_pid145985
00:09:38.537 Removing: /var/run/dpdk/spdk_pid146287
00:09:38.537 Removing: /var/run/dpdk/spdk_pid146377
00:09:38.537 Removing: /var/run/dpdk/spdk_pid146582
00:09:38.537 Removing: /var/run/dpdk/spdk_pid146638
00:09:38.537 Removing: /var/run/dpdk/spdk_pid147226
00:09:38.537 Removing: /var/run/dpdk/spdk_pid147508
00:09:38.537 Removing: /var/run/dpdk/spdk_pid147718
00:09:38.537 Removing: /var/run/dpdk/spdk_pid147873
00:09:38.537 Removing: /var/run/dpdk/spdk_pid148622
00:09:38.537 Removing: /var/run/dpdk/spdk_pid148929
00:09:38.537 Removing: /var/run/dpdk/spdk_pid149443
00:09:38.537 Removing: /var/run/dpdk/spdk_pid149914
00:09:38.537 Removing: /var/run/dpdk/spdk_pid150269
00:09:38.537 Removing: /var/run/dpdk/spdk_pid150802
00:09:38.537 Removing: /var/run/dpdk/spdk_pid151156
00:09:38.537 Removing: /var/run/dpdk/spdk_pid151616
00:09:38.537 Removing: /var/run/dpdk/spdk_pid152121
00:09:38.537 Removing: /var/run/dpdk/spdk_pid152437
00:09:38.537 Removing: /var/run/dpdk/spdk_pid152973
00:09:38.537 Removing: /var/run/dpdk/spdk_pid153371
00:09:38.537 Removing: /var/run/dpdk/spdk_pid153792
00:09:38.537 Removing: /var/run/dpdk/spdk_pid154321
00:09:38.537 Removing: /var/run/dpdk/spdk_pid154617
00:09:38.537 Removing: /var/run/dpdk/spdk_pid155147
00:09:38.537 Removing: /var/run/dpdk/spdk_pid155582
00:09:38.537 Removing: /var/run/dpdk/spdk_pid155966
00:09:38.537 Removing: /var/run/dpdk/spdk_pid156495
00:09:38.537 Removing: /var/run/dpdk/spdk_pid156801
00:09:38.537 Removing: /var/run/dpdk/spdk_pid157320
00:09:38.537 Removing: /var/run/dpdk/spdk_pid157797
00:09:38.537 Removing: /var/run/dpdk/spdk_pid158137
00:09:38.537 Removing: /var/run/dpdk/spdk_pid158664
00:09:38.537 Removing: /var/run/dpdk/spdk_pid159056
00:09:38.537 Removing: /var/run/dpdk/spdk_pid159656
00:09:38.537 Removing: /var/run/dpdk/spdk_pid160119
00:09:38.537 Removing: /var/run/dpdk/spdk_pid160656
00:09:38.537 Removing: /var/run/dpdk/spdk_pid161192
00:09:38.537 Removing: /var/run/dpdk/spdk_pid161567
00:09:38.537 Removing: /var/run/dpdk/spdk_pid162021
00:09:38.537 Removing: /var/run/dpdk/spdk_pid162551
00:09:38.537 Clean
00:09:38.537 04:26:16 -- common/autotest_common.sh@1453 -- # return 0
00:09:38.537 04:26:16 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup
00:09:38.537 04:26:16 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:38.537 04:26:16 -- common/autotest_common.sh@10 -- # set +x
00:09:38.537 04:26:16 -- spdk/autotest.sh@391 -- # timing_exit autotest
00:09:38.537 04:26:16 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:38.537 04:26:16 -- common/autotest_common.sh@10 -- # set +x
00:09:38.537 04:26:16 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:38.537 04:26:17 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:38.537 04:26:17 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:38.537 04:26:17 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:09:38.537 04:26:17 -- spdk/autotest.sh@398 -- # hostname
00:09:38.537 04:26:17 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:38.537 geninfo: WARNING: invalid characters removed from testname!
00:09:44.021 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda
00:09:47.472 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda
00:09:51.673 04:26:29 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:59.809 04:26:37 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:04.024 04:26:42 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:09.303 04:26:47 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:14.583 04:26:53 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:19.863 04:26:58 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:25.141 04:27:03 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:10:25.141 04:27:03 -- spdk/autorun.sh@1 -- $ timing_finish
00:10:25.141 04:27:03 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]]
00:10:25.141 04:27:03 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:25.141 04:27:03 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:10:25.141 04:27:03 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:25.141 + [[ -n 6180 ]]
00:10:25.141 + sudo kill 6180
00:10:25.412 [Pipeline] }
00:10:25.428 [Pipeline] // stage
00:10:25.432 [Pipeline] }
00:10:25.447 [Pipeline] // timeout
00:10:25.452 [Pipeline] }
00:10:25.466 [Pipeline] // catchError
00:10:25.472 [Pipeline] }
00:10:25.486 [Pipeline] // wrap
00:10:25.492 [Pipeline] }
00:10:25.505 [Pipeline] // catchError
00:10:25.514 [Pipeline] stage
00:10:25.516 [Pipeline] { (Epilogue)
00:10:25.529 [Pipeline] catchError
00:10:25.531 [Pipeline] {
00:10:25.544 [Pipeline] echo
00:10:25.545 Cleanup processes
00:10:25.550 [Pipeline] sh
00:10:25.839 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:25.839 171370 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:25.855 [Pipeline] sh
00:10:26.144 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:26.144 ++ grep -v 'sudo pgrep'
00:10:26.144 ++ awk '{print $1}'
00:10:26.144 + sudo kill -9
00:10:26.144 + true
00:10:26.157 [Pipeline] sh
00:10:26.446 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:26.446 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:26.446 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:27.827 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:40.061 [Pipeline] sh
00:10:40.350 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:40.350 Artifacts sizes are good
00:10:40.365 [Pipeline] archiveArtifacts
00:10:40.372 Archiving artifacts
00:10:40.765 [Pipeline] sh
00:10:41.054 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:41.070 [Pipeline] cleanWs
00:10:41.080 [WS-CLEANUP] Deleting project workspace...
00:10:41.080 [WS-CLEANUP] Deferred wipeout is used...
00:10:41.087 [WS-CLEANUP] done
00:10:41.089 [Pipeline] }
00:10:41.105 [Pipeline] // catchError
00:10:41.116 [Pipeline] sh
00:10:41.402 + logger -p user.info -t JENKINS-CI
00:10:41.412 [Pipeline] }
00:10:41.425 [Pipeline] // stage
00:10:41.430 [Pipeline] }
00:10:41.444 [Pipeline] // node
00:10:41.449 [Pipeline] End of Pipeline
00:10:41.488 Finished: SUCCESS